Wednesday, 6 July 2011

Mainframe:


Mainframes (often colloquially referred to as "big iron") are powerful computers used mainly by large organizations for critical applications, typically bulk data processing such as census, industry and consumer statistics, enterprise resource planning, and financial transaction processing.
The term originally referred to the large cabinets that housed the central processing unit and main memory of early computers. Later the term was used to distinguish high-end commercial machines from less powerful units.
Most large-scale computer system architectures were firmly established in the 1960s, and most large computers remained based on architectures from that era until the advent of Web servers in the 1990s. (The first Web server running anywhere outside Switzerland ran on an IBM mainframe at Stanford University as early as 1991.)
In the 1960s, most mainframes had no interactive interface. They accepted sets of punched cards, paper tape, and/or magnetic tape and operated solely in batch mode to support back office functions, such as customer billing. Teletype devices were also common, at least for system operators. By the early 1970s, many mainframes had acquired interactive user interfaces and operated as timesharing computers, supporting hundreds of users simultaneously along with batch processing. Users gained access through specialized terminals or, later, from personal computers equipped with terminal emulation software. By the 1980s, many mainframes supported graphical terminals (and terminal emulation), though not graphical user interfaces, but this form of end-user computing was largely made obsolete in the 1990s by the personal computer. Nowadays most mainframes have partially or entirely phased out classic terminal access for end users in favor of Web user interfaces; developers and operational staff typically continue to use terminals or terminal emulators.
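To make the batch model concrete, here is a minimal sketch in Python of the kind of back-office billing job such a machine might run: read a file of usage records, compute each charge, and write out the bills, with no user interaction at any point. The file names, record layout, and rate are all hypothetical.

    # Minimal sketch of a batch billing run: no interactive input,
    # just read records, compute charges, and write results.
    # File names, record layout, and the rate are hypothetical.

    RATE_PER_UNIT = 0.05  # hypothetical charge per unit of usage

    def run_billing_batch(in_path: str, out_path: str) -> None:
        with open(in_path) as records, open(out_path, "w") as bills:
            for line in records:
                # Each record: customer id and units used, comma-separated.
                customer_id, units = line.strip().split(",")
                amount = int(units) * RATE_PER_UNIT
                bills.write(f"{customer_id},{amount:.2f}\n")

    if __name__ == "__main__":
        run_billing_batch("usage_records.csv", "bills.csv")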
An early mainframe computer:

Several minicomputer operating systems and architectures arose in the 1970s and 1980s, but minicomputers are generally not considered mainframes. (Unix arose as a minicomputer operating system and has scaled up over the years to acquire some mainframe characteristics.)
Many defining characteristics of "mainframe" were established in the 1960s, but those characteristics continue to expand and evolve to the present day.

Modern mainframe computers have abilities not so much defined by their single-task computational speed (usually measured in MIPS, millions of instructions per second) as by their redundant internal engineering and resulting high reliability and security, extensive input-output facilities, strict backward compatibility with older software, and high utilization rates to support massive throughput. These machines often run for years without interruption, with repairs and hardware upgrades taking place during normal operation.
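As a quick illustration of the MIPS figure mentioned above (the numbers here are made up, not measurements of any real machine):

    # MIPS = instructions executed / (execution time in seconds * 10**6)
    instructions = 5_000_000_000  # hypothetical: 5 billion instructions
    seconds = 10.0                # hypothetical: executed in 10 seconds
    print(instructions / (seconds * 1_000_000))  # 500.0, i.e. 500 MIPS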

Robotics:

Robotics is the branch of technology that deals with the design, construction, operation, structural disposition, manufacture, and application of robots. Robotics is related to the sciences of electronics, engineering, mechanics, and software.



The word robotics was derived from the word robot, which was introduced to the public by Czech writer Karel Čapek in his play R.U.R. (Rossum's Universal Robots), which premiered in 1921.

According to the Oxford English Dictionary, the word robotics was first used in print by Isaac Asimov, in his science fiction short story "Liar!", published in May 1941 in Astounding Science Fiction. Asimov was unaware that he was coining the term; since the science and technology of electrical devices is electronics, he assumed robotics already referred to the science and technology of robots. In some of Asimov's other works, he states that the first use of the word robotics was in his short story "Runaround" (Astounding Science Fiction, March 1942). However, the word robotics appears in "Liar!"

The play, published in 1920, begins in a factory that makes artificial people called robots, creatures who can be mistaken for humans, though they are closer to the modern idea of androids. Karel Čapek himself did not coin the word; in a short letter referencing an etymology in the Oxford English Dictionary, he named his brother Josef Čapek as its actual originator.

In 1927 the Maschinenmensch ("machine-human"), a gynoid humanoid robot (also called "Parody", "Futura", "Robotrix", or the "Maria impersonator"), became the first and perhaps the most memorable depiction of a robot ever to appear on film; it was played by German actress Brigitte Helm in Fritz Lang's film Metropolis.
In 1942 the science fiction writer Isaac Asimov formulated his Three Laws of Robotics and, in the process of doing so, coined the word "robotics."

In 1948 Norbert Wiener formulated the principles of cybernetics, the basis of practical robotics.
Fully autonomous robots only appeared in the second half of the 20th century. The first digitally operated and programmable robot, the Unimate, was installed in 1961 to lift hot pieces of metal from a die casting machine and stack them. Commercial and industrial robots are widespread today and used to perform jobs more cheaply, or more accurately and reliably, than humans. They are also employed in jobs which are too dirty, dangerous, or dull to be suitable for humans. Robots are widely used in manufacturing, assembly, packing and packaging, transport, earth and space exploration, surgery, weaponry, laboratory research, safety, and the mass production of consumer and industrial goods.
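The Unimate's job, lifting parts from a die-casting machine and stacking them, is at heart a programmed sequence of moves. The sketch below shows the general shape of such a pick-and-place program; the Arm interface, its methods, and the coordinates are entirely hypothetical, not the Unimate's actual programming model.

    # Sketch of a programmed pick-and-place cycle in the spirit of the
    # Unimate's task. The Arm class and all coordinates are hypothetical.

    class Arm:
        def move_to(self, x: float, y: float, z: float) -> None:
            print(f"moving to ({x}, {y}, {z})")

        def grip(self) -> None:
            print("gripping part")

        def release(self) -> None:
            print("releasing part")

    def pick_and_place(arm: Arm, pick, place) -> None:
        arm.move_to(*pick)   # over the die-casting machine
        arm.grip()
        arm.move_to(*place)  # over the stack
        arm.release()

    arm = Arm()
    for layer in range(3):  # stack three parts, one per layer
        pick_and_place(arm, pick=(0.0, 0.0, 0.1), place=(1.0, 0.0, 0.1 * layer))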

Surface Computing:


A surface computer is a computer that interacts with the user through the surface of an ordinary object, rather than through a monitor and keyboard.
The category was created by Microsoft with Surface (codenamed Milan), a surface computer based entirely on a multi-touch interface and a coffee-table-like design, unveiled on 30 May 2007. Users can interact with the machine by touching or dragging their fingertips and objects such as paintbrushes across the screen, or by setting real-world items tagged with special bar-code labels on top of it.

The Surface is a horizontal display in a table-like form. Somewhat similar to the iPhone, the Surface has a screen that can register multiple touches at once and uses them to navigate multimedia content. Unlike the iPhone, which uses the electrical properties of fingers to detect touch, the Surface uses a system of infrared cameras to detect input. Uploading digital files only requires each object (e.g. a Bluetooth-enabled digital camera) to be placed on the Surface. People can physically move pictures across the screen with their hands, or even shrink or enlarge them. The first units of the Surface will be information kiosks in the Harrah's family of casinos.
T-Mobile will also receive units, for comparing several cell phones side by side, as will Sheraton Hotels and Resorts, which will use Surface to serve lobby customers in numerous ways.
The Surface has a 2.0 GHz Core 2 Duo processor, 2 GB of memory, an off-the-shelf graphics card, a scratch-proof, spill-proof surface, a DLP projector, and the five infrared cameras mentioned above. However, the expensive components required for the interface also give the Surface a price tag of between $12,500 and $15,000.
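The infrared-camera approach to touch detection can be approximated in a few lines: threshold each camera frame so that bright spots (fingertips reflecting infrared light toward the cameras) stand out, then treat each bright blob as a touch point. Below is a minimal sketch using OpenCV; the camera index and threshold value are assumptions, and a real system would calibrate and combine all five cameras.

    # Minimal sketch of camera-based touch detection: threshold one infrared
    # frame and report each bright blob as a touch point.
    # Requires OpenCV (pip install opencv-python); the camera index and
    # threshold value are assumptions.
    import cv2

    cap = cv2.VideoCapture(0)  # hypothetical IR camera index
    ok, frame = cap.read()
    if ok:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Fingertips reflecting IR light show up as bright regions.
        _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        for c in contours:
            x, y, w, h = cv2.boundingRect(c)
            print("touch at", (x + w // 2, y + h // 2))
    cap.release()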

Surface computing is the term for the use of a specialized computer GUI in which traditional GUI elements are replaced by intuitive, everyday objects. Instead of a keyboard and mouse, the user interacts directly with a touch-sensitive screen. It has been said that this more closely replicates the familiar hands-on experience of everyday object manipulation.
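The shrink-and-enlarge gesture mentioned above comes down to simple geometry: track the distance between two touch points and scale the image by the ratio of the current spread to the starting spread. A small sketch (the touch coordinates are hypothetical):

    import math

    def distance(p, q) -> float:
        return math.hypot(p[0] - q[0], p[1] - q[1])

    def pinch_scale(start_touches, current_touches) -> float:
        # Scale factor = current finger spread / initial finger spread.
        return distance(*current_touches) / distance(*start_touches)

    # Fingers start 100 px apart and spread to 150 px: enlarge by 1.5x.
    print(pinch_scale(((0, 0), (100, 0)), ((0, 0), (150, 0))))  # 1.5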
Early work in this area was done at the University of Toronto, Alias Research, and MIT. Surface work has included customized solutions from vendors such as GestureTek and Applied Minds (for Northrop Grumman). Major computer vendor platforms are in various stages of release: the iTable by PQLabs, Linux MPX, the Ideum MT-50, and Microsoft Surface.


Surface computing is slowly starting to catch on and to be used in real-world applications. Here is just a sample of where surface computing technologies have been used.
The Microsoft Surface is starting to pick up popularity and has been used in various places and venues. AT&T became the first retailer to use Surface to help its customers purchase phones: customers could place a phone on the Surface and receive full specifications as well as pricing. It has also been used in a wide variety of locations, including hotel lobbies such as Sheraton's, and at venues such as Super Bowl XLIII, where it helped police organize and monitor the event in great detail. It is also starting to gain use in the broadcasting industry and was used by MSNBC during the 2008 US presidential election. However, at USD $15,500 (device only), it is still considered expensive for most businesses.
There are other new surface computing applications that are still being developed, one of which is from the MIT Media Lab where students are developing wearable computing systems that can be used on almost any surface. The name of this device is SixthSense.

Innovation:

The term innovation derives from the Latin innovatio, the noun of action from innovare, "to renew or change," formed from in- "into" + novus "new"; the word first came into modern use in 1540, via the Latin past participle innovatus. Although the term is broadly used, innovation generally refers to the creation or improvement of products, technologies, or ideas. Innovation is distinguished from renovation in that innovation generally signifies a substantial change or difference rather than a more incremental one.


Innovation is an important topic in the study of economics, business, entrepreneurship, design, technology, sociology, and engineering. Colloquially, the word "innovation" is often synonymous with the output of the process. However, economists tend to focus on the process itself, from the origination of an idea to its transformation into something useful, to its implementation; and on the system within which the process of innovation unfolds. Since innovation is also considered a major driver of the economy, especially when it leads to new product categories or increasing productivity, the factors that lead to innovation are also considered to be critical to policy makers. In particular, followers of innovation economics stress using public policy to spur innovation and growth.



In the organizational context, innovation may be linked to positive changes in efficiency, productivity, quality, competitive positioning, market share, and so on, all of which can be affected positively by innovative forces. All organizations can innovate, including, for example, hospitals, universities, and local governments. Some will flourish under its influence; others will die.

Because innovation typically changes value, it may also have a negative or destructive effect, as new developments clear away or change old organizational forms and practices. Organizations that do not compensate effectively for innovative forces (mainly from outside) may be destroyed by those that do. Hence managing an organization typically involves risk. A key challenge in management is maintaining a balance between the current processes and business model and the innovations that may disrupt them.
Innovation has been studied in a variety of contexts, including in relation to technology, commerce, social systems, economic development, and policy construction. There are, therefore, naturally a wide range of approaches to conceptualizing innovation in the scholarly literature.