This is a tricky one. I’m looking for some technology from any time in the last 20 years or so with the following characteristics:
1. It is ubiquitous.
2. It was done right; that is, it’s a good thing they did it THAT way instead of some other way.
3. It is either true or believable that it COULD have been done some other way that wouldn’t be as good.
For example, the 26-key pad on the iPhone makes life better than if everyone had to triple-tap. Unfortunately, that fails #3.
Any and all suggestions welcome. Thanks!
Teflon almost wasn’t. If someone had not become curious about where the missing mass went… no Teflon.
To snag low-hanging fruit…
I would say the internet.
1. It’s certainly ubiquitous.
2. With the goal being an easy, unrestricted way to share information of all kinds (text, audio, video) quickly over vast distances it is awesome. It has been adapted and expanded by many, many people to have these capabilities.
3. It COULD have been done in some other way. For example, the net is still affected by its history of being a group of technologies that DIDN’T want to restrict people from uploading or accessing things. (or at least, in its early days, security and accountability weren’t big on the feature list.) This is why it’s so damn hard to prevent info or files from leaking or being copied on it; it wasn’t built with technologies in place that track every single upload to one specific user and there’s still a lot of backwards compatibility…nobody seems keen on attempting to roll out some major new protocol that significantly changes this. While the lack of accountability is bad in cases of, say, child porn, it’s good for people who have legitimate causes to talk about that some other group of people might want to stifle. And there are security overlays for things people agree need to be secure, such as finance stuff.
Certainly, if it had started out as restricted as some groups want it to be, it wouldn’t have spread as quickly in popularity, and wouldn’t have grown as quickly. Offices would have a lot more paper.
Perhaps compare it to how cell phones have a snarl of contracts and surcharges to use things that just “come with the package” on hardwired internet.
I wish I could elaborate on this more…I have just enough IT training to know the internet could have gone a very, very different direction than it did, and to be thankful it hasn’t (even though some groups are trying very hard to change that). Perhaps someone with more in-depth IT knowledge can get specific on what I’m trying to say. I’m being really damn vague and I wish I wasn’t.
The Internet. Wow. Might be able to make that work. We certainly wouldn’t be lacking in chutzpah. Can those decisions (i.e., internet freedom) be traced to a known small group of people who were considering doing it another way?
Whoops, I meant to clarify the first part of that: because they made the ends responsible instead of the network, they effectively made the network itself be hands off on the data and the ends responsible for everything, which is a primary cause of the freedom (anyone can run a server at any endpoint).
Umm, except apparently I previewed but never posted the thing I was clarifying. So that post was meant to be after this one:
http://en.wikipedia.org/wiki/History_of_the_Internet#TCP.2FIP
“By 1973, [Kahn and Cerf] had soon worked out a fundamental reformulation, where the differences between network protocols were hidden by using a common internetwork protocol, and instead of the network being responsible for reliability, as in the ARPANET, the hosts became responsible.”
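The design choice in that quote (reliability at the hosts, not in the network) can be illustrated with a toy sketch. Everything below is invented for illustration; the network layer just moves packets, unreliably, and the sender layers reliability on top by retransmitting:

```python
import random

random.seed(1)

def lossy_network(packet, drop_rate=0.5):
    """The 'network' just moves bytes, and not even reliably."""
    return packet if random.random() > drop_rate else None

def reliable_send(packet, max_tries=20):
    """Host-side reliability: retransmit until delivery succeeds."""
    for attempt in range(1, max_tries + 1):
        if lossy_network(packet) is not None:  # receiver would ACK here
            return attempt
    raise TimeoutError("gave up")

print(reliable_send(b"hello"))  # number of attempts this run took
```

Because the network makes no promises, any endpoint can speak any protocol over it, which is the freedom being discussed above.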
@skzb said: “Can those decisions (i.e., internet freedom) be traced to a known small group of people who were considering doing it another way?”
Not necessarily on a worldwide scale, but there are certainly governments that have (and do) greatly restrict and monitor their citizens’ ‘net access.
Building on D. M. Domini’s point, you could compare email (free, adaptable, you can set up your own server, etc) to SMS messages (expensive, pay more to send picture/video, roaming fees).
Alternatively how about building $600 cell phones out of plastic & metal compared to building them out of glass.
http://www.redmondpie.com/apple-iphone-4s-vs.-samsung-galaxy-s-ii-drop-test-video/
But maybe building cell phones out of glass doesn’t pass characteristic 3.
As much as I dislike this one: television and radio. The US method of licensing allowed multiple networks and programming competing for audience, which allowed programming (such as rock music) to gain an audience and thus shift programming. The UK method restricted all programming to the BBC, thus driving the pirate radio stations (you could not listen to the Beatles on UK radio stations during their heyday) and slowing the growth of programming.
“Can those decisions (i.e., internet freedom) be traced to a known small group of people who were considering doing it another way?”
Sure – think of Compuserve or the pre-internet AOL. Sure, you could communicate and share files, and even be somewhat anonymous, but they could arbitrarily decide to take down your files or posts, or restrict what topics were allowed on the message boards.
The Kindle.
1) With Kindle-format sales now approaching (or exceeding) sales of tangible books, the ubiquity of the Kindle seems to be established.
2) What set the Kindle apart from prior e-ink books was the ease of ordering a book electronically.
3) There were any number of e-ink products prior to the Kindle which failed to integrate the reader and the ordering mechanism. The integration is what revolutionized the e-book.
The iPhone app store.
1) With tens of thousands of apps, the app store is ubiquitous.
2) Allowing third party publishers to use the app store to sell apps in one marketplace drove costs way down for the publisher and the consumer.
3) Instead of allowing third parties to use the app store, Apple could have tried to prevent consumers from buying third party apps or it could have limited the app store to apps developed in-house. Notably Research In Motion had a device that was capable of running apps and failed to develop an app store until after Apple lit the path.
MapQuest (and any other ad-based internet service provider).
1) Virtually everyone with access to a computer uses MapQuest and the other services that provide free mapping as a way to plan travel.
2) Allowing free internet access is much better for the consumer than forcing consumers to buy software and/or a device in order to access the information.
3) All of the GPS sellers (Garmin, TomTom, etc.) are based on a sales model. They are currently struggling to find a way to compete.
Suppose we keep the internet, but lose the web? What if CERN had decided that Gopher, FTP and email were sufficient, and didn’t back the development of the www?
At a minimum, the WWW (or an alternative superior to Gopher) could have come much later, delaying or aborting the .com bubble and pushing a lot of the things we experienced further down the years, if they ever happened at all.
To build on my Twitter comment: The decision to make a “byte” be 8 bits rather than 6 turns out to have been a fortunate one. It was explicitly made by the IBM team designing the System 360 (a machine with many other pioneering inventions) and their logic was laid out in detail. As it happens, it was fortunate both for the effects (faster processing due to handling more bits in parallel, easier circuitry down the line such as when dealing with multi-byte floating point operations, more efficient RAM circuitry, slightly more ease in developing Unicode) and for some of the guiding rationale, emphasizing backward compatibility and the concept of a “family” of computer processors.
One reason they considered the 6-bit byte was efficiency in dealing with text (the 6-bit character codes of the day held enough symbols for most data; ASCII itself needs 7 bits): storing large text files would have been cheaper on such a system, and the integer arithmetic wouldn’t have been affected that badly. It would have been a reasonable decision.
If everyone had then copied them, computer performance would have been mildly hobbled, and the spread of US-designed computers to other countries not using ASCII would probably have been reduced.
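For the curious, the storage arithmetic behind the 6-bit temptation is simple. This assumes a hypothetical machine whose character code fits in 6 bits (real 6-bit codes of the era, like the BCD variants, held about 64 symbols):

```python
chars = 1_000_000                 # a million-character text file

eight_bit = chars * 8             # bits needed on an 8-bit-byte machine
six_bit = chars * 6               # bits needed on a 6-bit-byte machine

print(six_bit / eight_bit)        # 0.75: text storage 25% cheaper
```

A 25% saving on every text file was real money when memory cost dollars per byte, which is why the choice was genuinely contested.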
But the main reason I suggest it is that they picked it in part for compatibility with other hardware, including hardware down the road. This was an important idea, and if they had just swept it aside in favor of just perfecting *this* machine, then that attitude (which was already common, and, coming from Big Blue in a major new offering, would have been influential) could have seriously hampered and delayed the later personal computer revolution.
So I guess the invention in question is really “backward/sideways compatibility” but that one’s harder to pinpoint :)
(Oh, and to echo Vnend above: I distinctly remember being shown the first web browser. I didn’t see the point — fortunately more visionary heads prevailed on that one, but I can easily see how they might not have.)
What about Blu-ray vs HD DVD? Betamax vs VHS?
It would be easier to look at technology we have now that the “powers that be” tried to screw up in the first place.
DVDs were an alternative format to DIVX, which was a DVD you bought, but then had to pay every time you wanted to watch it. Just imagine if that had taken off over DVD. It would have been the model for every other bit of media. CDs you had to pay to listen to each time. Books you had to pay per page to read. Video games you had to pay per hour to play. The internet would probably turn into someplace where you had to pay to visit every page.
Another one is the music industry’s opposition to MP3 players. They sued to stop the first one and fortunately failed. But if they had succeeded, there would be no iPod or iPhone. It would have gotten rid of the idea of format shifting, so you would have had to buy a CD and then buy the music again as an MP3 (or whatever DRM-ed format was picked), or worse, have the fragmented market the music industry was heading toward, where each company had its own digital music store, its own DRM, and its own devices it worked on. If you liked music from 3 different companies you would have had to carry 3 different devices.
Or go back even further to the court case against the VCR. If the Supreme Court had believed it was as bad as the Boston Strangler, as the movie industry claimed at the time, we would never have had VCRs, video rentals, home video collections, DVDs, Blu-ray, DVRs, camcorders, and probably not cell phones with cameras built in. And the movie industry would be much smaller, since they now make so much money from video sales. Arguably, home video sales make TV shows more profitable, and could be part of the reason for the explosion of cable stations.
There are lots of court cases where different industries nearly shot themselves in the foot fighting progress.
Good stuff. Thank you all.
Right now, the one that grabs me as working neatest for what I want is the iPhone app store thing. Is it believable that Apple *almost* decided to keep it all proprietary? Based on the little I know, it seems that could be believable.
Google. Look at Google, then look at every search engine that existed before it. No one had to hit upon THAT algorithm. And in any given world where it DID exist, no one had to combine it with an utterly simple interface. And then image searches. And video. And maps. And… well, just pick a point where Google could have not thought to include “X.”
GPS technology. Nearly magical. It could have been done any number of ways, but they chose to use signal timing. It’s certainly ubiquitous. And available to civilians only the last 20 years.
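A toy sketch of the signal-timing idea: each measured delay, times the speed of light, puts the receiver on a circle around a transmitter, and the circles intersect at the receiver’s position. All positions and the brute-force solver below are made up for illustration; a real GPS receiver also solves for its own clock error and for altitude:

```python
import math

C = 299_792_458.0  # speed of light, m/s

# Hypothetical 2D setup: three transmitters at known positions; the
# receiver measures how long each timestamped signal took to arrive.
towers = [(0.0, 0.0), (100_000.0, 0.0), (0.0, 100_000.0)]
true_pos = (30_000.0, 40_000.0)
delays = [math.dist(t, true_pos) / C for t in towers]  # simulated

# Each delay pins the receiver to a circle of radius c * delay.
radii = [C * dt for dt in delays]

# Brute-force the grid point that best matches all three circles (a real
# receiver solves this algebraically rather than by search).
best = min(
    ((x * 1000.0, y * 1000.0) for x in range(101) for y in range(101)),
    key=lambda p: sum((math.dist(p, t) - r) ** 2 for t, r in zip(towers, radii)),
)
print(best)  # (30000.0, 40000.0)
```

The striking design point is that the satellites only ever broadcast; all the computation happens in the receiver, which is why an unlimited number of civilian devices can use the system at once.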
I think it’s hard to hold up the iPhone Apple store as something done right, unless by right you mean ‘maximizing profit’.
While Apple’s terms and limitations for creating apps on their store aren’t very restrictive, they’re still a little restrictive, and worse, the device is a closed system: if an app can’t get on the app store it’s not available for the device, period. (Unless the device is hacked, aka “jail-broken.”)
But it does guarantee Apple gets a slice of any sale made for something on the platform (and this is the source of some of the restrictions people have run into, involving in-app purchasing and such). Good for Apple’s bottom line, but not really good for anybody else.
While an entirely closed device (that nobody could develop apps for) would be far worse, it’s still a far cry from an environment like Windows where Microsoft gets nothing at all when a third party sells an application for it.
Hmm…good point, Sean. Maybe GPS. Someone on Twitter suggested AC vs. DC, but that’s older than I’d like, and I think it was driven by how they worked rather than by someone’s decision.
Besides Sean’s points, keeping Apps on the iPhone limited to what Apple built would have flown in the face of every successful hardware/software pairing in history. Has there ever been a successful, completely closed platform?
Vnend: Well, both IBM and Apple did exactly that for many years. That’s why us software types in the old days hated them both.
Very cool and useful, all. Here’s the snippet we’re trying to replace, on the theory that more information might help you help us more:
“This is crazy.”
“It gets crazier.”
“I can hardly wait. What are the impossible parts?”
“We’ll get there. Let’s start with the merely improbable. Do you have an iPhone?”
“Huh?” Her brows came together. “Yes.”
“Does it make your life easier that you can use all twenty-six keys for letters instead of triple-tapping?”
“Of course.”
“You’re welcome.”
She stared, waiting for me to say more.
“It was a very close decision at Apple, how to do the keypad. That’s the sort of thing you can do with oxytocin and dopamine and a few words in the right ears.”
—
and thanks so much for all the ideas. Steve knows some very cool folks!
Oh, by the way. Everyone, meet Skyler White, with whom I wrote the book in question. Go buy her books on account of they rock.
http://www.amazon.com/Dreams-Begin-Skyler-White/dp/B005K5PB3Q/ref=sr_1_1?s=books&ie=UTF8&qid=1319341698&sr=1-1
http://www.amazon.com/Falling-Fly-Skyler-White/dp/B003TO6E8G/ref=sr_1_3?s=books&ie=UTF8&qid=1319341698&sr=1-3
Debit cards.
Visa launched its first check card in 1995, and then debit cards took off. The tech is ubiquitous, fairly straightforward and simple to use, and could most definitely have been done in different ways that wouldn’t have been as good.
Plus, there are just a whole lot of nefarious possibilities attached to the shift from cash to electronic “money” transfers. (Probably even some fictional ones!) Also, I do actually know people who don’t have iPhones/iPads/Kindles/etc. I even know some people who don’t have credit cards (by choice), but very damn few people don’t have a debit card.
Follow the money.
Steve, I bought my first Apple in 1979 (or was it ’78?). It never ran any Apple software other than its OS; everything else was either BASIC or assembly from various sources. Meanwhile, Visicalc and Wordstar were the breakout programs for the IBM PC, and I know Wordstar was not an IBM product.
As far as the iOS keyboard, I think the FITALY solution is vastly superior. Enough so that I am mildly surprised that Apple didn’t buy the company and use it.
“It never ran any Apple software other than its OS”
Unless I’m missing something, that’s my point.
Apple and IBM were notorious back then because they had weird connections that didn’t fit with the industry, so you had to do it THEIR way.
Sean:
Why should making a profit disqualify a technology? If we disqualified any technology that was developed (at least in part) in an attempt to turn a profit, there would be a very small list of candidates. Moreover, the app store drastically reduced costs for the consumer. You can buy apps for the iPhone for a few dollars that just a few years prior would have cost more than $40 to install on your desktop. Yes, Apple gets a slice of the pie. Apple made our lives measurably better with the iPhone and the app store; why shouldn’t it turn a profit?
“It never ran any Apple software other than its OS”
Huh? There was lots of commercial Apple ][ software – including Visicalc, which was an Apple program before there even was an IBM PC. Word processors, educational stuff – it wouldn’t have taken off like it did if all it had was the OS and Basic.
But that does remind me of something that fits, even if it’s really obscure: the difference between the Apple and IBM 5.25″ floppy formats. Apple used a variable-speed drive, which allowed more data to be stored on the outside sectors, while IBM used a fixed speed, so every sector held the same amount of data regardless of its physical size.
If you’re really looking to replace that section, then you can hardly go wrong with the layout of the phone keypad in the first place. Ever notice that it’s different from a calculator?
That’s because Bell Labs did the research.
http://www.apa.org/research/action/button.aspx
http://www.vcalc.net/Keyboard.htm
… although that was more like 50 years ago. Sorry!
Internet: close to perfect, but still vulnerable in a day-to-day sense by the plutocracy of DNS for those who rely upon it.
Google: handy, but not an order of magnitude better than competitors, and the price to be paid in privacy is yet to be rendered.
Kindle: 1984.
GPS: spoiled by the gov’ts that cripple resolution for non-military.
appStore: spoiled by censorship and non-competitive behaviour.
Teflon: awesome, but it can become toxic and still doesn’t better a well-seasoned cast iron skillet.
My candidate: mobile phone networks.
If you look hard enough, you can, for $10 a month, get unlimited instances of the 160-character written word sent across space to any person of your choice in a near-instant. For $30 or so, you can send your very own sonorous voice. Long-distance charges are a thing of the past. Wired handsets are a thing of the past. Payphones are a thing of the past. Roadside emergencies are a thing of the past. And, perhaps sadly, “planning” is a thing of the past.
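For what it’s worth, the classic GSM per-message limit works out to 160 characters, and the number falls out of simple arithmetic: the payload is 140 bytes, packed with the 7-bit default alphabet.

```python
# GSM text-message arithmetic: 140 bytes of payload, 7 bits per
# character in the default alphabet.
payload_bytes = 140
chars = payload_bytes * 8 // 7
print(chars)  # 160
```

That packing decision is itself a small example of the thread’s theme: squeezing an eighth bit out of every character bought 20 extra characters per message.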
Fossil fuel energy: it is used everywhere, and it’s cheap, plentiful, and convenient. OK, “cheap” and “plentiful” are two common lies that we tell ourselves.
I disagree with the Internet. It has a large number of inefficiencies most people don’t know about because they’re hidden from you. It wastes large amounts of resources repackaging the same data over and over, with unused fields that burn your bandwidth. It’s just not that good, and could have been done better.
The things I would say — the op amp, microwave, mechanical pencil — are all more than 20 years old.
Non-dial-up connectivity, in all its current permutations.
Broadband over cable caused, in a very real sense, the explosion of what people can, and are willing to, do on the internet.
Cell phones, and the 3G/4G/etc data streams that service them; same thing. (I get looked at like I should have crinoids nesting in my hair when I tell people that I use my phone exclusively to do voice and the occasional text message, and don’t *want* more bells and whistles.)
Fiber optics, though this comes with a fairly robust caveat (that being “the folks who actually control the backbone are causing artificial scarcity to maintain profit margins”). The amount of data that a single FO thread can handle borders on ridiculous (think Gibson’s depiction of data flow viewed from afar; this approaches terabits per second).
Or, hell – open source software. It’s less organized than a corporate project, but it’s still fecund ground for “hey, it would be cool to do ______” and there’s essentially no reason not to, assuming you’re clever enough to figure out how (insert obvious parallel to writing a story [here]).
What about the good old computer mouse? As an input device it’s fast, intuitive, and cheap. Now that mice are optical and dust can’t clog up a roller ball, they work pretty dependably.
That said, I’ve seen alternatives to mice out there for people with repetitive strain issues, and those alternative techs could have quite easily become the controller of choice if they had been marketed first.
The digital watch, with digital letters.
Everyone now reads time in a certain way: 10:45 is almost always “ten forty-five” instead of “quarter to eleven.”
It is sort of the metric system of time-telling… it cleans out all the uncertainties and makes time accessible as a universal system. And it is EVERYWHERE if you stop and think about it. It is even starting to replace gauges of all sorts…
If you’ve ever seen attempts at fancy digital watches, you know there are tons of other (and less useful) ways of going about it.
Hell we could all be wearing cuckoo clock faces on our watches!
What about the Compact Disc? It’s ubiquitous. It certainly seems to have been done right, and I remember from reading the bio of the man behind it that he very consciously set the capacity goal such that a single disc could contain, in its entirety, a Beethoven symphony he loved. That speaks to the notion that it could have been done differently (and probably worse).
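The arithmetic behind that capacity goal is easy to check, assuming the standard CD audio parameters (the Beethoven anecdote itself is widely repeated but hard to verify):

```python
# How much raw audio a 74-minute disc has to hold: 16-bit stereo PCM
# sampled at 44.1 kHz.
minutes = 74
sample_rate = 44_100        # samples per second, per channel
channels = 2
bytes_per_sample = 2        # 16-bit PCM

raw_bytes = minutes * 60 * sample_rate * channels * bytes_per_sample
print(raw_bytes)            # 783216000, roughly 783 MB of audio data
```

That is why early CDs held vastly more than contemporary hard drives: the capacity target was set by music, not by computing.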
CDs, digital watches, the mouse, fibre optics, the phone keypad, and Apple OSes are all older than 20 years.
OK, now I’m confused…
I said that if Apple had kept the iPhone’s software limited to only what Apple wrote, it would be going in the face of the proven hardware/software paradigm, and asked if there had been a successful closed system (the only software available from the hardware vendor).
I thought you replied that both Apple and IBM were initially just such systems. I pointed out that the only Apple software I ran on my original Apple ][ was the OS, everything else was from somewhere else.
I.e., not a closed system as I (intended to) describe originally.
Vnend: Oh. Okay, I misunderstood.
Are you looking only for computer-based technologies? There are a lot of other kinds of technology out there: medical, chemical, scientific, agricultural, automotive, electrical, etc. Examples include the artificial heart, DNA testing, the plastics industry, the Hubble Space Telescope, modern fertilizers, kinetic-energy-absorbing body designs, and graphene semiconductors.
Also, you should consider “ubiquitous.” Computers are pretty common in the US, Japan, the UK, and other first-world countries, but they are kind of limited in second- and third-world countries. In Africa, in many parts of Asia, or in the Philippines or East Indies, computers are the toys of governments, big businesses, and industries. Cell phones are much more common in that regard.
If you are only thinking of computer stuff, though, then besides my first choice of the humble cell phone, I would have to say the calculator. Yes, it has been around longer than you specify, but as a science teacher I have seen the improvements in their capabilities. Most people in first-world countries have them, and there are lots of ways to make them depending on the intended use, which covers both the #2 and #3 qualifications.
(The problem with calculators is that they have killed kids’ ability to do arithmetic in their heads. Trust me: every year, only about 5-10% of the students in all my classes can do mental math. I do multiplication problems in my head at the board, and the students think I am doing black magic or something. And it’s not just me; I have heard this from multiple teachers in multiple school districts, in two different states.)
If you include non-computer-based technologies, then there are lots and lots of options. One example from the late ’60s or the ’70s (older than you want, I know, but…) is zinc plating of car bodies, frames, and panels to prevent rusting. It is used not only in first-world countries but everywhere cars are used. It is good they do it that way because, before cars were galvanized, they tended to rust out in 2-5 years and you had to buy a new one before it fell apart. Also, you could use other plating materials, or paints, but zinc is cheaper and easier to apply than many of the others, including paints.
Another example, and one I mentioned before, is plastics. Just about everyone, everywhere in the world, uses plastics. Tribal people in Africa, South America, and Australia have plastic water bottles instead of animal-skin water bags. We have used other things like steel, ceramics, or animal bladders in the past, but plastics are lighter than metal, tougher than ceramics, and can be cleaned and reused almost indefinitely, unlike animal bladders. Watch the science channels and you’ll see them being used by even the poorest farmer in the farthest backwater of whatever country you name.
If you had the power to magically remove all the plastic from the entire USA, almost everything we use every day would be rendered more or less useless instantly. Cars: no plastic around your battery, so no starting up when you turn the key; try a hand-crank instead. Food preservation and preparation cannot use plastic wrap or bagging on the industrial scale, and you cannot use Saran wrap or baggies on a small scale in the home, so getting food on the table goes back to cold rooms and canning. Airplanes won’t get off the ground; computers don’t have all the little parts that let them run; no cell phones, MP3 players, CDs, DVDs, many types of fabric (nylon, polyester, rayon), no toys, no diapers, many common hand tools, many household items, some houses, insulation for phone lines, etc., etc., etc. Without plastic, all we have are some rocks and sticks, and we are restarting the Industrial Revolution from somewhere around 1920-ish, with the only advantage being almost 100 years of metallurgy and materials science knowledge at the new starting point.
OK, sorry I wrote so much, and you probably thought of it already, but I was up on my soapbox and enjoying myself…
Junglejim: No, I’m not committed to computer technology. But it has to be something where it is reasonable to say, “It’s a really good thing they decided to do it that way, and they might not have.”
I believe that there was talk at one point of making HTML a proprietary format, and of requiring a licensing fee for its use. Had things progressed down that road, the development and adoption of the internet might have looked very different …
Constantine: Hmm. That might do it. Even if it isn’t true, it’s believable. :-)
My suggestion would be the adoption of the MP3 audio standard, which took place in the early 1990s.
It’s not so much that Musicam won over ASPEC, but that some kind of compressed format was introduced and standardized. This doesn’t just mean that I’m not carrying a Discman or its descendants along with my iPhone. It also means that there’s been heavy commercial pressure to develop small, solid-state memory at prices affordable to the masses. Lots of knock-on effects there.
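The scale of the win is easy to sketch; the numbers below assume standard CD audio and a common MP3 bitrate:

```python
# Raw CD audio versus a typical 128 kbps MP3 stream.
cd_kbps = 44_100 * 16 * 2 / 1000   # 1411.2 kbps of uncompressed PCM
mp3_kbps = 128

print(cd_kbps / mp3_kbps)          # about 11: an ~11x smaller file
```

An order-of-magnitude reduction is what made solid-state players, and the memory-price pressure mentioned above, commercially plausible.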
I think that will work. Thanks, Abi!
wikis on the internet
SKZB: That takes out most of my ideas, since there are so many ways to do those things, but maybe it can inspire somebody else.
One of the most momentous technology decisions of modern times was IBM’s decision to have an open OS on the 1981 PC. This took three parts. First, they signed a contract that allowed MS to sell the OS to anyone.
Second, and maybe more amazing, they published a commented listing of the BIOS (basic input/output system) code. True, that was proprietary, but it wasn’t long before some clever entrepreneur set up a “clean room,” or rather two. In one room they read the code and the comments and told the people in the other room what to program it to do, and the programmers in the second room wrote the code that did it. The company (whose name I have forgotten) sold it to clone manufacturers.
The third decision was to publicize the hardware design; I know much less about the legal aspects of that. I would conjecture that IBM’s reason for publishing the BIOS was to make it easy for third-party programmers to write programs for the new machine. At least one manufacturer kept their OS secret and charged royalties (or would have, had anyone bought in) to reveal the secret codes necessary to write programs for their machine. That was the TI 99/4A, and it went down the tubes almost immediately for want of software.
These decisions were made as they were basically because the PC was essentially a skunk-works project inside Big Blue. Yes, it had been approved at the highest levels, but they didn’t have the personnel needed to go it alone, do the OS (although they did do the BIOS), write a full panoply of programs for potential users and so on.
Later on, IBM came to regret these decisions. They came out with PS/2, a new hardware design, and OS/2, a new OS, but these bombed too. Although IBM hung on for many years, they are now entirely out of the PC business.
I don’t know what Steve is planning to do with these responses, but for several years I have thought that if only I were a writer, I would write an alternate-history sort of sci-fi based on the premise that IBM had not opened their systems. If I had to hazard some guesses, I would speculate that Apple and Unix would be the only survivors by now. I never liked the Mac (essentially because I still do much of my work from the command line, which early Macs lacked; later ones have it, but that’s essentially Unix), and I never could get comfortable with Unix. Too esoteric. It is possible that I might never have started using computers, but now I cannot live without them.
The only thing wrong with this answer is that it is not clear that the decisions taken were best possible, but they sure could have been a lot worse.
The PC was based on the 8086, and that was a poorly designed microprocessor. Its legacy issues still plague the PC, but may be eliminated finally in the next couple years. These were very low level issues that non-programmers wouldn’t have ever seen. It would have been marvelous if they had done a ground up redesign of the PC in 1990, when computers weren’t quite common enough that it would affect everyone the way it would now.
So, no, from the perspective of a computer designer who designed hardware for the Mac in the 1990s and investigated development for the PC: no way. It wasn’t even close to ideal. It had one thing going for it, flexibility, but even that was relative to proprietary competition with little to no flexibility.
I haven’t read all the entries, so I apologize if someone else has pointed this out, but the decision to use octal, rather than decimal, as a number system might fit your criteria, though maybe not so much for criterion #2. I am thinking of an entire society adopting it, long before the creation of computers and computer programs.
This might not work for your purposes, but Sony did a fantastic job with the shutter mechanism (I hate to call it that, since it’s electronic) in their HD cameras. The shutter has been done many different ways, and no one has come close to them. Yet. It makes beautiful pictures and mimics film shutters. The main tests are things like lightning and camera flashes. In most digital cameras what you get is a bar of light across your frame. Can’t use it. Sony gets the flashes, I know, and I think it can catch lightning, but I’m not sure.
I have to agree with Big Mike re the IBM PC. Not because the PC itself was an amazing work of engineering (it wasn’t, and on that level it was inferior to several of its contemporary competitors), but because it was the first consumer-level open-hardware machine, and as such it was a -huge- development.
On the PC, you could run whatever OS you wanted that the BIOS could load — so from there, we get mac on PC hardware, linux, and so on.
On the PC you could trivially swap out the graphics card, the hard drive (eventually), the CPU — so we’ve got competition in all those areas driving quality and power up, and price down.
Without the IBM PC, and the decisions that made it up, we might still be running closed hardware machines built à la carte, filled with unswappable components, costing ten times (or more) what modern computers cost and having a tenth the power.
Regarding the Kindle and other e-books, they fail #2.
The problem with all forms of digital media at this time is that there is no LEGAL way to transfer ownership of the “license”. This goes back to the bad old days, when real books actually contained licenses forbidding transfer.
When people say “internet”, they usually really mean the World Wide Web, which runs on top of the internet.
Regarding the PC, much of what happened with it was done because of IBM’s antitrust issues and its settlement with the government.
Part of the problem I see with fulfilling your requirements is that new tech always comes along, so what was once the apex of design no longer is.
The concept of USB (the universal serial bus) is a great example. It creates a uniform method of connecting a device to a computer, allowing greater extensibility. Thing is, as tech gets better, the need to improve the devices and protocols exists, to make things more efficient.
The USB thumb drive is a great innovation, but as connectivity becomes easier and faster, you have the switchover to Wi-Fi devices as well as Bluetooth.
Things change.
VHS tapes allowed people to watch movies when they wanted. Pay-per-view allowed for exclusive viewing outside of fixed locations. Cable allowed for more content and clearer transmission. Personal computers allowed for quicker processing of information. Fax machines allowed for easy transmission of images. Radio phones allowed for contact w/o needing fixed wires. Walkmans allowed for easy portable music listening.
VHS -> DVD -> Blu-ray -> digital media.
Pay-per-view still exists, but appears to be dying.
Cable is being somewhat supplanted by the net, transmission over phone lines, and small satellite receivers.
PCs are being changed back into thin clients, like iPads and the like, with the use of cloud storage and, to a lesser extent, cloud computation.
Faxes are pretty much dead, replaced by digital photo transmission.
Radio phones have been replaced by cell phones.
Walkmans have been replaced by MP3 players.
Newspapers as a physical medium are dying.
The net allows for cheap publishing but also ease of piracy and ways to communicate that bypass government controls.
In memory of Dennis Ritchie, I’ll cite the C programming language, though it’s been around for longer than 20 years. It is ubiquitous (in certain circles), it could have been (and has been) done much worse, and it did what it set out to do exceedingly well. Others in that category include TCP/IP and, to a lesser extent, HTML.
But I really came to ask Steve Brust if the village named Lorimel in the Viscount of Adrilankha novels is named for Lorimel of the many hands, a deity from Zelazny’s “Isle of the Dead” (a rather hauntingly memorable novel, that). Thanks.
This is both a) too long ago and b) it went in the opposite direction (the bad idea won, for reasons to be seen shortly) but it’s a really good example and an interesting anecdote. So here goes:
We used leaded gasoline for about 60 years in the US instead of ethanol or an ethanol/gasoline mix because tetraethyl lead could be patented and ethanol couldn’t.
http://www.livinghistoryfarm.org/farminginthe50s/crops_05.html
nuncupatory: Not consciously, but now that you mention it, probably unconsciously. I love that novel.
802.11 Apple patented it.
E-ink — not quite ubiquitous (yet).
Minimally invasive surgery is becoming ubiquitous in most hospitals. (It can definitely be done worse, i.e., lethally.)
Image stabilizers, combined with digital auto-focus, auto white balance, and auto contrast — these enable point-and-shoot photography that still leads to decent pictures in the hands of an amateur. There are many ways each of those technologies could have been (or has been) screwed up.
Mobile phones. Specifically, the security/encryption on them could have been much, much worse.
Flash memory: it’s cheap, it’s quiet, and it’s energy efficient. It is essentially the same technology as the old floppy disk, which demonstrates how it could have been worse.
Double-walled beer cans — they keep my beer cold :) [not ubiquitous, though]
Styrofoam Cup.
Keeps cold things cold. Keeps hot things hot.
An absolute marvel of modern invention.
Really late to this party, but I would say text messaging.
The character limit is actually its strength. If messages were longer, they would call them emails.