wheelspinner
Are We There Yet? Member
Nobody's perfect, I'm a nobody, so ...
Posts: 4,103
Post by wheelspinner on Oct 6, 2011 2:00:16 GMT -5
I suppose there will be endless paeans of praise for the man, so there is no need for me to add to them here.
I still have an uncomfortable feeling about Jobs; I always have. The fanboys and media love him, BUT:
1/ Apple relies on near-slave labour conditions in China to produce those lovely goodies we are buying from them. The pressure on their workers is so intense that there have been suicides as a result, about which Apple did squat.
2/ Far from being innovators, Apple manipulates patent law to choke off competition, buying patents en masse for things they never invented and then wielding them against companies like Samsung.
3/ At the same time, Apple has no regard for the intellectual property or trademarks of others. The perfect example is how Jobs trashed the court agreement with Apple Records over the use of Apple Records' trademark. They routinely sue companies whose names are close to Apple products, even when those companies pre-existed Apple's product, and they regularly use their legal power to intimidate small businesses out of their rights.
4/ As far as I can tell, Jobs hadn't personally invented anything for decades. He was a genius at marketing, but far too many people believe he personally invented things like the iPod. He didn't.
5/ Apple are price manipulators who charge discriminatory prices in Australia. Even when the Australian dollar was at parity with the US dollar, we were still expected to pay twice as much as Americans for digital downloads of music and the like. This is simply inexcusable price-gouging.
You will easily get the impression that I am not a fan of Apple the company. They turned into the very thing they mocked IBM for in their famous 1984 ad. Apple now is an uncaring behemoth of a company that seeks only to gouge customers, competitors and suppliers alike. They wrap what are pretty ordinary technological developments in a veneer of cool design, which enables them to be as rapacious as they feel like being. They then use their financial and legal muscle to block competitors rather than beat the competition with quality and innovation.
It's sad that Jobs died, but I won't miss him. And his company could use some time in the sin-bin as well.
Post by Georgina on Oct 6, 2011 9:34:20 GMT -5
That was refreshing, WS. Apple has recently become The Evil Empire in my lexicon. I've refused to jump on board with all of the gadgetry, particularly because of their attempts to control the sale and distribution of entertainment media. With iTunes, they're taking a fairly impressive shot at cornering the music distribution industry. I won't support or tolerate any one company or entity controlling access to art. If iTunes becomes the only venue for music purchases, then Apple gets to control what the public can even have access to. Um, no.
Anyway, yes, Jobs put on a fabulous marketing and sales performance. I hadn't realised that he wasn't responsible for actually inventing any of our latest innovations. And yet, you know, WS, a huge chunk of the people I speak with daily believe that he did, and venerate him for it. I doubt history will rewrite it.
Current times are weird.
I'm sure someone else has thought that before, likely beginning a thousand or so years ago.
oskar
Are We There Yet? Member
Posts: 5,534
Post by oskar on Oct 7, 2011 17:16:05 GMT -5
Post by joethree56 on Oct 9, 2011 18:25:43 GMT -5
I too feel that the hero worship of Jobs is largely misplaced. For example, his much-vaunted operating system was largely ripped off from the Unix/Linux corner of the cupboard, and his near-paranoid obsession with total control over every aspect of hardware/software integration on Apple Macs sent them into a development blind alley. A bit like a Rolls-Royce car: what the Apple Mac does, it does very well, in an old-fashioned, rather boring, insular way.
wheelspinner
Are We There Yet? Member
Nobody's perfect, I'm a nobody, so ...
Posts: 4,103
Post by wheelspinner on Oct 9, 2011 21:42:41 GMT -5
One thing I didn't mention is that the Macintosh was spawned by a Jobs visit to Xerox PARC, where his hosts showed him the working prototype of their top-secret R&D project. Jobs then rushed back to Cupertino and commissioned his Apple developers to steal Xerox's idea. Massive hypocrisy from a man who was such a ruthless guardian of his own company's IP - even when that IP had simply been bought from somebody else as an anti-competitive measure.
Post by joethree56 on Oct 13, 2011 20:07:08 GMT -5
The American computer scientist Dennis Ritchie, who has died aged 70 after a long illness, was one of the co-inventors of the Unix operating system and the C programming language. Unix and C provided the infrastructure software and tools that created much of today's computing environment – from the internet to smartphones – and so have played a central part in shaping the modern world.
The origins of Unix go back to the 1960s, long before the microchip and personal computers had been invented. The nearest thing to personal computing was the so-called computer utility. This consisted of a large mainframe computer that was used simultaneously, and at great expense, by a couple of dozen users sitting at typewriter terminals.
By the middle of the decade, the computer utility appeared to provide the way ahead, and a consortium of General Electric, Bell Labs and the Massachusetts Institute of Technology (MIT) embarked on a project called Multics (Multiplexed Information and Computing Service). Multics would be the world's largest computer utility, supporting several hundred simultaneous users. Bell Labs was responsible for the operating software.
Ritchie joined the programming division of Bell Labs in 1967. His father, Alistair Ritchie, had had a long career there, and had co-authored an influential technical book, The Design of Switching Circuits (1951). Dennis was born to Alistair and his wife Jean in the northern New York suburb of Bronxville, and grew up in New Jersey, where Bell Labs had its Murray Hill site. He studied physics and applied mathematics for a bachelor's degree (1963) and computer science for a PhD (1968) at Harvard University.
Multics was in crisis when he arrived at the research organisation. Indeed, many big software projects were in crisis – people were just beginning to learn that writing large programs was horrendously difficult and costly. In 1969, after four years of development, Bell Labs pulled out of the project.
Ritchie and another lead programmer on Multics, Ken Thompson, were left somewhat bereft by the project's demise. Multics promised a wonderful computing experience, but the operating system was too complex to build. This led them to rethink their software philosophy. They would build a simpler, smaller system that they would call Unix – the name was "a kind of treacherous pun on Multics", Ritchie once explained.
The idea was not immediately appreciated by their managers, and they had to "scrounge around" for an obsolete computer to develop Unix. The computer had just 16 kilobytes of memory, and this alone was an encouragement to keep things simple. If Multics was the victim of baroque software architecture, then Unix would be pure Bauhaus.
Unix was designed over a period of a few months in 1969, and a prototype was running early the following year. Their colleagues remained unconvinced. However, by offering to write some text-processing software, Ritchie and Thompson managed to persuade the Bell Labs patent department to acquire a full-size computer and run Unix on it.
They decided to rewrite the operating system entirely for the new machine. The first version of Unix had been written in the computer's native machine code, which was difficult and slow. For this next version of Unix, Ritchie invented a new language called C, which bridged the gap between machine code and programming languages such as Fortran and Cobol.
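[An aside from me, not part of the obituary: a minimal sketch of what that gap-bridging means in practice. The loop below is the kind of high-level construct Fortran offered, while the pointer cast reaches down to individual bytes of memory, the way machine code does.]

    #include <stdio.h>

    int main(void)
    {
        int word = 0x12345678;
        /* View the integer's storage byte by byte, as machine code would. */
        unsigned char *bytes = (unsigned char *)&word;

        /* A high-level loop over low-level storage; the order in which the
           bytes appear also reveals the machine's endianness. */
        for (unsigned i = 0; i < sizeof word; i++)
            printf("byte %u: 0x%02x\n", i, bytes[i]);

        return 0;
    }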
C also had an interesting ancestry. The progenitor was a language jointly designed at Cambridge and London universities in 1964 and known as CPL (Combined Programming Language). CPL never survived, but one of the development team, Martin Richards, became a visitor at MIT. There he designed a simpler version of the language for systems implementation, BCPL (Basic CPL).
Once Thompson and Ritchie discovered BCPL, they decided to use it for writing Unix: to do so they squeezed it into 8 kilobytes and renamed it B. Finally, a new and improved version was developed and named C, which, Ritchie mused, "left open the question whether the name represented a progression through the alphabet or through the letters BCPL".
C made writing software immeasurably easier, and it also made software portable – a program written in C could be recompiled to run on almost any machine. The new version of Unix was completed in 1973, and since it was written in C, it, too, was portable.
Because Bell Labs's parent, AT&T, was a regulated telephone monopoly, it was prohibited from competing in the computer industry, and so had no pecuniary interest in Unix. This allowed Ritchie and Thompson to distribute Unix free of charge to universities and research institutions, which loved its clean, economical design.
Universities began to train their students in Unix and C, and when they graduated they took the culture into industry, where it blossomed. In 1978 Ritchie and a colleague, Brian Kernighan, wrote a textbook, The C Programming Language, which became a bestselling programming primer for the next 15 years. Despite the prosaic title, it was equally a book about programming style, and it shaped programming practices worldwide.
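[Another aside: for anyone who has never opened that book, the program on its first pages – quoted here from memory, in its modern standard-C form – is still the canonical starting point, and its six lines say a lot about the economy of style the book taught:]

    #include <stdio.h>

    int main(void)
    {
        printf("hello, world\n");
        return 0;
    }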
Ritchie and Thompson got early recognition for their work when they received the 1983 Turing award of the Association for Computing Machinery, often dubbed the Nobel prize of computing. But the Unix story was just beginning. The Advanced Research Projects Agency of the US department of defence adopted Unix for the network research that eventually created the internet, and it remains the software glue that binds everything together.
Steve Jobs was a Unix devotee. When he was ousted from Apple Computer in 1985, he used Unix as the basis for his NeXT computer workstation. After his return to Apple ten years later, he brought Unix with him and it became the foundation for all of Apple's current products.
Unix is also at the heart of today's open-source software movement. In the 1980s, following deregulation, AT&T began to assert its intellectual property rights in Unix. A Finnish computer science student named Linus Torvalds decided that the world needed a free version of Unix, which became known as Linux. The system was written by hundreds of programmers, mostly steeped in the Unix and C culture, collaborating over the internet. Today, the free Linux operating system powers billions of electronic devices, from smartphones to set-top boxes.
Ritchie and Thompson – usually together – received many honours and awards, culminating with the National Medal of Technology awarded by President Clinton in 1998. The citation described their inventions as having "led to enormous advances in hardware, software, and networking systems and stimulated the growth of an entire industry." Earlier this year, the pair won the Japan Prize. Ritchie spent all his career at Bell Labs, retiring as head of systems software research in 2007.
• Dennis MacAlistair Ritchie, computer scientist, born 9 September 1941; died 8 October 2011
• Dennis Ritchie's Bell Labs homepage
Post by joethree56 on Oct 13, 2011 20:08:33 GMT -5
Sorry, the links didn't show. It was, of course, from the Guardian.
Post by joethree56 on Oct 20, 2011 12:17:15 GMT -5
Mike Daisey's The Agony and the Ecstasy of Steve Jobs opened a day after Apple's co-founder was laid to rest and portrays him as hero and villain of the piece. But has it come too soon?

The day Steve Jobs died, Mike Daisey stayed up all night, in a darkened room, lit only by the glow of his MacBook Pro, reading a cache of personal emails from Apple's founder. He was searching for fresh insights into the genius behind the iPod, the iPhone and the iPad, but no matter how long he stared at the screen, they remained elusive. The man was gone.

Daisey has been an Apple fan since childhood, but in the course of writing his latest monologue, The Agony and the Ecstasy of Steve Jobs – it's been in the works for years, but with uncanny timing has just opened at the Public Theatre here in New York – the relationship deepened to the point of obsession.

The show is a love song to Apple's gorgeous, transformative devices, but also a vehement attack on the way they are produced, at a vast, dehumanising factory complex in southern China, where workers assemble laptops and smart phones for next to nothing, under constant surveillance.

Jobs is portrayed as a "visionary asshole" who trampled on friends in his quest for perfection. He changed the way we understand and engage with the world three times, Daisey argues, but he was also a manipulative egotist, an unbearably demanding, capricious boss and a ruthless businessman who would do anything to achieve market dominance.

Seeing his flaws ripped open and hearing about his seeming indifference to the miserable conditions at Apple's outsourced factories is a bracing antidote to the hagiography that has dominated since Jobs's untimely death the week before last. But is it too soon? The show opened a day after Jobs was laid to rest.

The love affair with technology that began when Daisey's grandfather bought him an Apple IIc in 1984 eventually took him to the Special Economic Zone of Shenzhen, in southern China, where roughly 50% of the world's consumer electronics are made. Daisey stood outside the gates of the Foxconn Technology plant, which employs 430,000 people, and asked workers to tell him stories about the conditions inside. He heard stories of a hand crippled by a decade on the assembly line, workers threatened with life in prison for joining a union and 13-year-old girls doing 13-hour shifts.

The factory was briefly in the news last year, when it installed nets under the top-floor windows following a rash of suicides, but otherwise it may as well operate in a black hole. Chances are your phone was made there, but most people know nothing about the place. It came as a revelation to me that most electronic devices are assembled by hand, rather than by robots.

Daisey weaves his trip to Shenzhen into a history of Apple, from the company's guerrilla origins, flying a pirate flag over Silicon Valley, via its near-death experience in the 1990s, to modern-day ubiquity, in a performance that is often hilarious and always intense. Jobs is both hero and villain of the piece, as it charts his rise, fall and rise again. But Daisey denies he's unfair to Jobs. "If the goal was to slander the man, there's ample ammunition," he says. "The show would be six hours long."

It ends with Daisey telling the audience "tonight is a virus", and asking them to email Apple's new CEO, Tim Cook, to call for independent supervision in Shenzhen. His messages from Jobs were replies to such requests.
Reading through them, Daisey concluded that Apple's founder "turned his back on these things a long time ago". In this, Jobs had the rest of the industry for company (many major technology companies are clients of Foxconn), not to mention anyone who has ever owned a laptop or MP3 player, but it still came as a shock to discover that he did not "think different" when it came to exploitation in China. Its subject might still seem raw, but The Agony and the Ecstasy of Steve Jobs is a welcome rebuke to the sanctification of a technological pioneer – and a reminder that reputations are often more complex than they seem.
© 2011 Guardian News and Media Limited or its affiliated companies. All rights reserved. www.guardian.co.uk/stage/theatreblog/2011/oct/20/steve-jobs-apple-new-play/print
wheelspinner
Are We There Yet? Member
Nobody's perfect, I'm a nobody, so ...
Posts: 4,103
Post by wheelspinner on Oct 20, 2011 19:25:36 GMT -5
But is it too soon? The show opened a day after Jobs was laid to rest.
On the contrary, now is exactly the right time to be correcting the record before the mythology takes over and becomes accepted fact. We owe it to the victims of Jobs' career to recognise the price they paid for his ascendancy.
wheelspinner
Are We There Yet? Member
Nobody's perfect, I'm a nobody, so ...
Posts: 4,103
Post by wheelspinner on Oct 20, 2011 19:30:53 GMT -5
BTW, thanks for posting about Ritchie's death. He was an equally significant figure in the history of the computer industry, but his passing did not attract nearly the attention that Jobs' did, because he was not a publicity-seeking self-marketer.
At Infosys' main campus in Bangalore, the conference rooms are named after significant figures in maths and science. There is a Ritchie room; there isn't a Jobs room.