IBM and Apple - A True Story of a Big Brother


We do a great deal of innovation work, and work with many companies on developing their innovations, and are always inspired by the way that small companies can take on big companies, and win. I've already showcased one great local company (Bright Red Books), so I thought I'd have a look at two companies who have coped with change, and who now sit side by side. One company is 100 years old (IBM) and the other is a new kid on the block (Apple).

As we stand at the creation of an amazing new Cyber Age, one must wonder which of these companies will still be going in 100 years' time. Oh, I wish I had a time machine! My money is on IBM.


Introduction


This week it was announced that IBM are likely to purchase over 200,000 Apple Mac computers over the next year. It has taken thirty years, but Apple's vision of breaking the mold of computer systems has finally come true.



As a quick estimate, assuming that the Macbooks will have the best specification, I reckon that the total spend could be:
£400,000,000
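
(That figure is just a back-of-the-envelope sum, and the per-unit price is my own assumption rather than a reported figure: at around £2,000 for a top-specification Macbook, 200,000 × £2,000 = £400,000,000.)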

It also shows that IBM are now fully out of the low-end market, where their engineers are more likely to be seen with Macbooks than Thinkpads. With IBM, too, on its 100th birthday, it is amazing to see a company that has been knocked out of the market it created thriving in new and exciting areas.

The computer company graveyard is full of companies who led for a while, but eventually struggled in the marketplace, and were often taken over by other up-and-coming companies. Many might remember Sinclair, Commodore and Atari for their strides in breaking new ground with their gaming computers, or DEC pushing forward with their minicomputer, or even Compaq creating one of the first laptops.

All managed to fail in some way, and be overtaken (or be taken over), but IBM have slugged it out with the best of them, gracefully left markets in which they could not compete, and broken successfully into others. For a large company, this isn't an easy judgement. Xerox shows us a company who innovated many of the key ideas of the computer, but failed to see them through. Both Apple and IBM have shown that they can not only innovate, but also execute.

Apple and Big Brother


In 1984, Apple used the year of Big Brother to invoke George Orwell's vision of the future, showing that Apple aimed to smash the existing monopoly that companies such as IBM and DEC had with their computer systems.



To increase the hype, the advertisement was only shown once, but where it would have the maximum impact: within the commercial break at the 1984 Super Bowl. At the time the drive for Apple was around their new Apple Macintosh computer, and it was left to the young (and suited) Steve Jobs to showcase the true power of the Mac.



Unfortunately for Apple, the Macintosh was rather underpowered, but it did shame the conventional approach to computer systems with its GUI (Graphical User Interface). Many of the ideas for the Macintosh, too, were inspired by Steve's visit to the Xerox PARC site.


Apple's best decision ever!


After the release of the Macintosh, Apple were struggling, especially as they were swimming against the PC tide. They used the PowerPC processor set (developed with IBM and Motorola), which had an architecture which worked in the opposite way (big-endian rather than little-endian) to the Intel one. They thus struggled to continue the development of the system, and had to be bailed out by Bill Gates. Steve Jobs then made two key decisions:
Switch to the Intel x86 architecture. This meant that the hardware for the Mac was the same as for any PC.
Dump their existing operating system and adopt Unix at the core. Underneath the user interface was thus a well-tested operating system, for which there were many developers and system architects around who could work with the system.

These were radical changes: Apple basically stopped the ship, jumped off, and created a completely new ship. The changes saved Apple, as they did not have to develop new hardware (or at least use expensive hardware) and could adopt the methods used for Unix as the kernel for their system. The genius was then not to use the horrible X11 windowing system, but to concentrate on making the user interface beautiful - as that is what makes the Mac stand out (and not its underlying core).

Apple’s decision to redevelop a new operating system for the Macintosh based on Unix was a momentous one. A family of related operating systems, Unix has evolved since the early 1970s and continues to be used and developed today. Technically OS X is a “Unix-like” operating system called Darwin; Linux is another Unix-like operating system. This decision meant the company could rely on the stability of Unix and focus on the user experience.
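
On any modern Mac you can see this Unix lineage for yourself. As a minimal sketch (using only Python's standard library), the kernel identifies itself as Darwin:

    # Ask the operating system to identify itself. On a Mac this prints
    # "Darwin" - the Unix-like core beneath the OS X user interface -
    # while on Linux (another Unix-like system) it prints "Linux".
    import platform

    print(platform.system())   # e.g. "Darwin" on a Mac
    print(platform.release())  # the kernel version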

Phoenix from the flames


With the sale of first its desktop PC business and then its server business to Chinese partner Lenovo, IBM has come full circle. By exiting the hardware business, IBM leaves behind the low-end market it invented and returns to its roots in high-performance computers, software, and a focus on the client.

With the sale of their low-end server business to Lenovo, IBM completed their journey from a company who led the industry for 70-odd years, to one who produced the PC and became an also-ran. After a 35-year detour, they have now fully returned to their roots: producing high-performance computers; leading on computer software integration; and focusing on the client and their needs. The full circle is perhaps highlighted by one of the great computers of the past: the IBM System/360, which led the industry for decades. For IBM, the generic selling of computer boxes has never been seen as a conduit for them to innovate and lead. For the creator of the ATM and the hard disk, IBM have thrown off the shackles of the last 40 years, and are back where they want to be.

The current trends within the computing industry are clear for all to see, as smaller hand-held units become the device of choice for consumers, and computing power is now being bundled into clusters, where resources can be shared across the cluster, and provided on an on-demand basis. The desktop and the low-end servers become just the end part of a large-scale infrastructure providing an elastic computing provision – which is basically a mainframe: one which has lots of sub-computers, and which can be easily scaled up and torn down.

The world has changed, too, since IBM created the PC, with its 4.77MHz clock and 640kB of memory. Now we have hand-held devices with four processing cores which run more than 1,000 times faster than the original PC, with more than 25,000 times the memory. In fact, it is not the processing power and memory capacity of a single device that is the key thing these days; it is the ability to add it into a cluster that provides the most interest for many companies. A failure in any part of a computer used to cause many problems, but with a clustered infrastructure the failure is not noticed by the users, as the data and processing power is mirrored across the cluster.
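
As a rough sanity check on those ratios (a minimal sketch; the hand-held figures are my own assumptions of four cores at around 2GHz each and 16GB of memory, not measured values):

    # Compare the original IBM PC (4.77 MHz, 640 KB) against an assumed
    # modern hand-held device (four cores at ~2 GHz, 16 GB of RAM).
    pc_clock_mhz = 4.77
    pc_memory_kb = 640

    handheld_clock_mhz = 4 * 2000          # four cores at ~2 GHz each
    handheld_memory_kb = 16 * 1024 * 1024  # 16 GB expressed in KB

    print(f"Speed:  ~{handheld_clock_mhz / pc_clock_mhz:,.0f} times faster")
    print(f"Memory: ~{handheld_memory_kb / pc_memory_kb:,.0f} times the memory")

This gives roughly 1,700 times the clock rate and around 26,000 times the memory, in line with the figures above.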

So we have moved on from the days of single computers connecting to a network; we now see many computers connected together to create a self-healing infrastructure, where it is fairly easy to add new cluster elements, and where the focus is on building high-performance clusters which are used to process and analyse large data sets. It is this high-end market which IBM see as the future, and many companies, too, see their ability to succeed in the marketplace as resting on their ability to use data analytics to drive forward.

IBM have generally developed a broad range of products which span everything from application software to computer hardware. In the 1960s and 1970s, large computer companies, such as DEC, defined the standards in the industry. The IBM PC changed this, and provided a platform of generalised hardware which companies could quickly copy to define new industry standards. The definition of a layered approach to networking also allowed companies to specialise as horizontal integrators, and they were able to move faster and innovate more than the vertical integrators.

Turning a Large Ship Around


Within the computing industry, companies have to spot opportunities, and make sure they move their product provision to take advantage of market changes. It is an industry in which leading companies can go from boom to bust in a short time. There are thus many examples of companies, at their peak, failing to spot changes in the market, including Compaq, Sun Microsystems and DEC, who became fixated on a certain product range and failed to see the evolution of the market.

Even Apple struggled for a time in the 1990s to find markets for its hardware, and struggled to move the industry away from Microsoft Windows and the IBM PC towards its own operating system and computer hardware. They struggled against the impact of the IBM PC, and, almost as a last resort, adopted the same hardware that was used in the IBM PC (as their previous computers used the PowerPC microprocessor, which had a different way of running software than the Intel microprocessors used on the PC that IBM developed), and integrated Unix at the core of their operating system.

Both of these changes considerably reduced their investment in the unseen parts of a PC, and focused their concentration on the parts the user was most interested in: the usability of the computer. In 2009, Apple completed their transformation with Mac OS X Snow Leopard, which only supported Intel-based architectures. This was probably, apart from the IBM transformation, one of the smartest moves ever seen in the computing industry. For a company such as IBM, who have based their product range on technical innovations, the route taken by Apple was not really one that IBM would ever have felt comfortable with.

Cloud, Cloud and More Cloud


While many companies in the computing industry, especially ones focused on desktop systems, such as Dell, are trying to understand what their existing product range should be and where they need to develop, IBM have provided one of the best examples of how a large corporation can lead within an industry sector, detect where their impact is fading, and go forward and transform themselves with a renewed focus.

For such a large organisation, IBM have managed to do this seamlessly, and have come out as one of the leaders of the pack in Cloud Computing and Big Data. IBM made large-scale computers, took a detour for 40-odd years, and have now gone back to their roots. As one of the first companies to create mainframe computers, and the creator of one of the first programming languages – FORTRAN (created in 1957, and still used in the industry) – they are now back in the place they find most comfortable: supporting business sectors rather than just computing needs.

Microsoft, Intel and Apple have successfully managed to plot paths through rapid changes in the computing industry, and have kept themselves in business for over 40 years, while still innovating and leading the market in certain areas. While Apple and Intel have continued to invest in hardware development, IBM spotted a while back that the requirement for low-end computer hardware, especially for desktops, would offer very little in the way of long-term profitability. So, in 2005, they signalled the first move out of low-level hardware by selling off their PC business, and this is now complete with the sale of the low-level server business, both to Lenovo.

The computing market, which was based around desktop computers from the 1970s until recently, is now focusing on mobile devices, which do not use the architecture developed initially by IBM, and on high-end servers which run Cloud Computing infrastructures. The need for “bare-metal” servers, where one operating system runs on one machine, is reducing fast, as high-end servers are now capable of running many servers and hosts at the same time.

IBM has thus identified that it is the high-end market which will provide the future, especially in applying Big Data analysis to their range of services – becoming more service-oriented and developing in more profitable areas. These signs can also be seen within the IT security industry, where the need for security products, such as firewalls, stays fairly static, but the demand for security consultancy services and support rapidly increases.

At one time, one operating system ran on one computer, as the hardware could only cope with this. Once the computing power existed within a single machine to run more than one operating system at a time, and still give acceptable performance, it was the beginning of the end for the low-level server market.

Big Data


The requirement and market for hardware remains fairly static, but Cloud Computing and Big Data processing continue to expand fast, and they highlight the increasing dependence that many market sectors have on the provision of Web services.

The amazing thing for IBM is that they have moved from a company which was built on defining hardware standards and controlling the industry, to one that is built on software and high-performance systems, and one that embraces open standards (especially for open source software). They have thus transformed themselves from a hardware company into a software one which leads the world. IBM is still seen as one of the most innovative companies in the world (with five Nobel Prizes and numerous awards for scientific impact, including inventing the ATM, magnetic stripe cards, relational databases, floppy disks and hard disks), and one with a strong brand image.

Their renewed focus goes back to their roots of the 1950s, with their lead within mainframe computers, and it is now built around their advanced computing infrastructure. In the 1990s, IBM showcased the increasing power of computers with the defeat of Garry Kasparov by the IBM Deep Blue computer. While the real mastery there was just the sheer power of searching through millions of possible moves and finding the best, they continued with their focus on beating humans in an area where humans triumph: understanding the English language. With this, IBM Watson managed to beat human opponents on Jeopardy!, and then managed to achieve a higher success rate in lung cancer diagnosis than leading cancer specialists. For the cancer diagnosis, Watson was sent back to medical school, and learnt how to spot the signs of lung cancer by analysing a whole range of unstructured data and using natural language processing.


Happy Birthday - The History of IBM


One of the first occurrences of computing technology was in the USA in the 1880s. It was due to the American Constitution demanding that a census be undertaken every 10 years. As the population in the USA increased, it took an increasing amount of time to produce the statistics, and by the 1880s it looked likely that the 1880 census would not be complete until 1890. To overcome this, Herman Hollerith (who worked for the Government) devised a machine which accepted punched cards with information on them. These cards allowed a current to pass wherever a hole was present.

Hollerith’s electromechanical machine was extremely successful, and was used in the 1890 and 1900 censuses. The company he founded would go on to become part of CTR (the Computing-Tabulating-Recording Company), and later International Business Machines (IBM). Unfortunately, Hollerith’s business fell into financial difficulties, and was saved by a young salesman at CTR named Tom Watson, who recognized the potential of selling punch-card-based calculating machines to American business. Watson eventually took over the company and, in 1924, renamed it International Business Machines Corporation (IBM). After this, electromechanical machines were sped up and improved, and electromechanical computers would soon lead to electronic computers, using valves.

Figure: Punch cards


After the creation of ENIAC, progress was fast in the computer industry: by 1948, small electronic computers were being produced in quantity, and within five years some 2,000 were in use; by 1961 it was 10,000, and by 1970, 100,000. IBM, at the time, had a considerable share of the computer market, so much so that a complaint was filed against them alleging monopolistic practices in its computer business, in violation of the Sherman Act. In January 1956, the US District Court made a final judgment on the complaint against IBM, and a ‘consent decree’ was signed by IBM, which placed limitations on how IBM conducted business with respect to ‘electronic data processing machines’.



In 1954, the IBM 650 was built, and was considered the workhorse of the industry at the time (it sold about 1,000 machines, and used valves). In November 1956, IBM showed how innovative they were by developing the first hard disk, the RAMAC 305. It was towering by today’s standards, with 50 two-foot-diameter platters giving a total capacity of 5MB. Around the same time, the Massachusetts Institute of Technology produced the first transistorised computer: the TX-0 (Transistorized Experimental computer). Seeing the potential of the transistor, IBM quickly switched from valves to transistors and, in 1959, produced the first commercial transistorised computer: the IBM 7090/7094 series, which dominated the computer market for years.

In 1960, in New York, IBM went on to develop the first automatic mass-production facility for transistors. In 1963, the Digital Equipment Company (DEC) sold their first minicomputer, to Atomic Energy of Canada. DEC would become the main competitor to IBM, but would eventually fail as they dismissed the growth in the personal computer market.

The second generation of computers started in 1961, when the great innovator Fairchild Semiconductor released the first commercial integrated circuit. In the next two years, significant advances were made in the interfaces to computer systems. The first was by Teletype, who produced the Model 33 keyboard and punched-tape terminal. It was a classic design, and was used on many of the available systems. The other advance was by Douglas Engelbart, who received a patent for the mouse pointing device for computers. The production of transistors increased, and each year brought a significant decrease in their size.

The third generation of computers started in 1965, with the use of integrated circuits rather than discrete transistors. IBM again was innovative, and created the System/360 mainframe: in the course of history, a true classic computer. Then, in 1970, IBM introduced the System/370, which included semiconductor memories. All of these computers were very expensive (approximately $1,000,000) and, while they were the great computing workhorses of the time, they were extremely expensive to purchase and maintain. Most companies had to lease their computer systems, as they could not afford to purchase them.

Figure: IBM System/360


As IBM happily clung to their mainframe market, several new companies were working away to erode their share. DEC would be the first, with their minicomputer, but it would be the PC companies of the future who would finally overtake them. The beginning of their loss of market share can be traced to the development of the microprocessor, and to one company: Intel. In 1967, though, IBM again showed their leadership in the computer industry by developing the first floppy disk. The growing electronics industry started to entice new companies to specialize in key areas, such as International Research, who applied for a patent for a method of constructing double-sided magnetic tape utilizing a Mumetal foil interlayer.

The beginning of the slide for IBM occurred in 1968, when Robert Noyce and Gordon Moore left Fairchild Semiconductor and met up with Andy Grove to found the Intel Corporation. To raise the required finance they went to a venture capitalist named Arthur Rock, who quickly found the required start-up finance, as Robert Noyce was well known for being the person who first put more than one transistor on a piece of silicon. At the same time, the IBM scientist John Cocke and others completed a prototype scientific computer called the ACS, which used some RISC (Reduced Instruction Set Computer) concepts. Unfortunately, the project was cancelled because it was not compatible with IBM’s System/360 computers.

In 1969, Hewlett-Packard branched into the world of digital electronics with the world’s first desktop scientific calculator: the HP 9100A. At the time, the electronics industry was producing cheap pocket calculators, which led to the development of affordable computers when the Japanese company Busicom commissioned Intel to produce a set of between eight and 12 ICs for a calculator. Instead of designing a complete set of ICs, Ted Hoff, at Intel, designed a single integrated circuit chip that could receive instructions and perform simple functions on data: a chip which could be programmed to perform different tasks. This design became the first ever microprocessor, and soon Intel (short for Integrated Electronics) produced it as a general-purpose 4-bit microprocessor, named the 4004. In April 1970, Wayne Pickette proposed to Intel that they use the computer-on-a-chip for the Busicom project. Then, in December, Gilbert Hyatt filed a patent application entitled ‘Single Chip Integrated Circuit Computer Architecture’, the first basic patent on the microprocessor.

The 4004 caused a revolution in the electronics industry, as previous electronic systems had a fixed functionality; with this processor, the functionality could be programmed by software. Amazingly, by today’s standards, it could only handle four bits of data at a time (a nibble), contained around 2,300 transistors, had 46 instructions, and allowed 4KB of program code and 1KB of data. From this humble start, the PC has since evolved using Intel microprocessors. Intel had previously been an innovative company, and had produced the first memory device (static RAM, which uses six transistors for each bit stored in memory), the first DRAM (dynamic memory, which uses only one transistor for each bit stored in memory) and the first EPROM (which allows data to be downloaded to a device, where it is then permanently stored).

In the same year, Intel announced the 1KB RAM chip, which was a significant increase over previously produced memory chips. Around the same time, one of Intel’s major partners, and also, as history has shown, competitors, Advanced Micro Devices (AMD) Incorporated was founded. It was started when Jerry Sanders and seven others left – yes, you’ve guessed it – Fairchild Semiconductor. The incubator of the electronics industry was producing many spin-off companies.

At the same time, the Xerox Corporation gathered a team at the Palo Alto Research Center (PARC) and gave them the objective of creating ‘the architecture of information’. This would lead to many of the great developments of computing, including personal distributed computing, graphical user interfaces, the first commercial mouse, bit-mapped displays, Ethernet, client/server architecture, object-oriented programming, laser printing and many of the basic protocols of the Internet. Few research centers have ever been as creative and forward-thinking as PARC was over those years.

In 1971, Gary Boone, of Texas Instruments, filed a patent application relating to a single-chip computer, and the microprocessor was released in November. In the same year, Intel delivered the 4004 microprocessor to Busicom, and then, in 1972, Intel was the first to develop an 8-bit microprocessor: the 8008. Excited by the new 8-bit microprocessors, two kids from a private high school, Bill Gates and Paul Allen, rushed out to buy the new 8008 device. This, they believed, would be the beginning of the end for the large and expensive mainframes (such as the IBM range) and minicomputers (such as the DEC PDP range). They bought the processor for the high price of $360 (possibly a joke at the expense of the IBM System/360 mainframe), but even they could not make it support BASIC programming. Instead, they formed the Traf-O-Data company and used the 8008 to analyse tickertape read-outs of cars passing in a street. The company would close down in the following year (1973), after it had made $20,000, but from this enterprising start one of the leading computer companies in the world would grow: Microsoft (although it would initially be called Micro-soft).

At the end of the 1970s, IBM’s virtual monopoly on computer systems started to erode: from the high-powered end as DEC developed their range of minicomputers, and from the low-powered end by companies developing computers based around the newly available 8-bit microprocessors, such as the 6502 and the Z80. IBM’s main contenders, other than DEC, were Apple and Commodore, who introduced a new type of computer: the personal computer (PC). The leading systems at the time were the Apple I and the Commodore PET. These captured the interest of the home user, and for the first time individuals had access to cheap computing power. These flagship computers spawned many others, such as the Sinclair ZX80/ZX81, the BBC microcomputer, the Sinclair Spectrum, the Commodore Vic-20 and the classic Apple II (all of which were based on the 6502 or Z80). Most of these computers were aimed at the lower end of the market, and were mainly used for playing games and not for business applications. IBM finally decided, with the advice of Bill Gates, to use the 8088 for its version of the PC, and not, as they had first thought, the 8080 device. Microsoft also persuaded IBM to introduce the IBM PC with a minimum of 64KB of RAM, instead of the 16KB that IBM planned.



In 1973, the model for future computer systems appeared at Xerox’s PARC, when the Alto workstation was demonstrated with a bit-mapped screen (showing the Cookie Monster, from Sesame Street). The following year, at Xerox, Bob Metcalfe demonstrated the Ethernet networking technology, which was destined to become the standard local area networking technique. It was far from perfect, as computers contended with each other for access to the network, but it was cheap and simple, and it worked relatively well.

IBM was also innovating at the time, creating a cheap floppy disk drive. They also produced the IBM 3340 hard disk unit (a Winchester disk), which had a recording head which sat on a cushion of air, 18 millionths of an inch above the platter. The disk was made with four platters, each 8 inches in diameter, giving a total capacity of 70MB.



The days of IBM leading the field very quickly became numbered as Compaq managed to reverse-engineer the software which allowed the operating system to talk to the hardware: the BIOS. Once they did this, IBM struggled to set standards in the industry, and made several attempts to define new operating systems, such as OS/2, and new computer architectures, with the MCA bus standard. The industry decided that common standards were more important than ones defined by a single company.

Conclusions


Companies in the past have depended on individual computers, but increasingly it is their use as part of a cloud, and the capacity to collect, store and process data, that is their most important asset. The cloud – the combined horsepower of thousands of computers and web-based software services – will become the most powerful distributed computer ever built.

IBM have shown they are moving towards this vision, which could bring benefits to every citizen, especially in evolving areas such as health care and education. Apple, too, have dedicated themselves to the things people really want, and, with sales of the Mac on the up, we will see if Apple can survive 100 years, as IBM has.

IBM do have a strong eye for spotting what works, and for them it is: Big Data, the Cloud and Health Care. IBM is not quite the Big Brother of 1984, but a Big Brother as part of the family, and IBM and Apple have a close relationship in software development, as the iPad is a device of choice in many industries. Both, too, have a Unix kinship, with both companies building on it as their core platform.

So who will be around in 100 years' time? My money is on IBM ... as the need for hardware reduces by the day, and the need for the Cloud increases! If you are interested, we have a Symposium coming up which looks at the move towards software-defined infrastructures, where you will hopefully see the future.

Written by William Buchanan
