Post by thurstan on Jul 1, 2009 12:11:36 GMT -5
www.forbes.com/asap/2000/0821/087.html
An interesting article about what computers would be like in 2010, written from a 2000 perspective. The thing I take from this is that we ALWAYS overestimate how technology will advance. We look back at ideas from the 70s, 80s, etc. and laugh at how we overestimated back then, but then go and do it ourselves! It's obvious that in 10 to 15 years technology will be very similar to what we have today, just faster, smaller, more efficient, etc. Sure, the look and aesthetics may be quite different, but the underlying thing still won't be that different. Look at cars, for example.
Post by relayer on Jul 2, 2009 13:11:20 GMT -5
Post by David Murray on Jul 3, 2009 16:24:35 GMT -5
You know, it is funny really. When people predicted, in movies and books decades ago, what today would be like, they all got it wrong. We have no flying cars, there are no people on the Moon or Mars, and we don't get all of our food from little pills.
But the interesting thing is all of the advanced technology we managed to create that was never even dreamed of: the cell-phone revolution, iPods, tiny camcorders, even laptop computers for that matter.
Sometimes it is really hard to see what will be popular, even for me. For example, I remember reading about 802.11 wireless technology 10 years ago. The Apple iBook clamshell had just come out and they really hyped the wireless feature (albeit very expensive back then). I remember thinking to myself that it was a feature I would probably never use. I couldn't see why I would want to pay for it when I was perfectly happy plugging my laptop into Ethernet cables around the house and office. But now I would not own a laptop if it did not have wireless. I'd rather give up sound or some other feature than give up wireless.
It is also funny that the article just posted talked all about holographic hard drives and the like, but made no mention of flash memory, which has really transformed our storage and is on the verge of making optical media obsolete.
Post by thurstan on Jul 4, 2009 3:29:07 GMT -5
Osborne and Shugart seemed very blinkered and closed-minded about technology: "Oh, 3.5" floppies will not catch on, don't be silly."
It showed that whilst they were great engineers, they were not visionaries at all, or were just not equipped to comment on the future.
Really, it is best to look at current technology and assume it will advance but still stay pretty much the same. Add in some new or experimental tech and you will have a more realistic view of the future.
Holographic storage, I think, is the next big thing; they are making good progress with it from what I have seen. Optical computing still has a long way to go, and Intel and co. are very creative with silicon!
Post by Harmik on Jul 7, 2009 6:41:34 GMT -5
I have had The Computer Chronicles in my favorites for a long time and have been meaning to watch them all; I will have to get onto that.
Post by thurstan on Jul 7, 2009 16:39:13 GMT -5
Adam Osborne and George Morrow really irritate me on Computer Chronicles, God rest their souls. They are really stuffy and a bit narrow-minded about things, with very blinkered views.
Post by cadorbolin on Jul 9, 2009 7:46:47 GMT -5
The major innovations that I see as having a huge impact on computing over the past 10 years are:
1. Wireless internet
This brought us portable computing and is definitely having a huge impact on the sale of CAT-5 cables. In fact, Earl will probably make a podcast episode about what the internet was like in the mid 1990s: dialup, big routers the size of a PC tower, etc.
2. Solid State Storage Devices
While this is currently about 10 times more expensive than traditional hard disk storage, the latter will go the way of drum storage. No moving parts, plus the ease of connecting an external storage device that can be carried around on a keychain. Thanks to these, you can carry your entire music collection and listen to your favorite songs over and over again until you get sick and tired of them.
3. Netbooks (like the Asus eee)
Smaller is better -- the days of carrying a monstrous 10-pound albatross just so you can run the latest version of Windows are over. My eee weighs 2 pounds, runs the most effective anti-virus OS out there (Linux), and I can take it almost anywhere.
4. Net 2.0 (Google, Youtube, Facebook, Linux)
In the 1960s and 1970s, most computing was done through timesharing and teletype terminals. The 1980s and 1990s broke away from this thanks to the personal computer industry, and you bought software to install locally on the machine. But the more things change, the more they stay the same. The terminal-->server model is back, and with a vengeance. You can do word processing and spreadsheets through Google. Enjoy a never-ending high school reunion on Facebook. I add Linux here because it is really an internet-based OS. Whenever I need a piece of software, I simply type "sudo apt-get install xxx" to download it from the repositories. That said, Linux is not really mainstream, and I'm probably betraying my personal bias towards that OS.
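To give a concrete example, grabbing a media player looks something like this (assuming a Debian/Ubuntu style system; the package name is purely illustrative):

sudo apt-get update       # refresh the package lists from the repositories
sudo apt-get install vlc  # fetch the package plus its dependencies and install it

Compare that to hunting down installers and codecs by hand on other platforms.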
Before there was Youtube, if you wanted to share a video with someone, they had to go through a huge rigmarole of installing codecs and the like. By using a very common standard (Adobe Flash), Youtube brought video sharing to the masses.
The scary part of Net 2.0 is that it makes it much easier to implement the paranoid fantasy of a "New World Order". As much as we love computers, it is thanks to them that the government and nefarious parts of the private sector could impose a totalitarian state.
As the "Linux Action Show" podcast says, "Some day, THE BORG will run on Linux".
The fear of computers in the 1940s was that they would eventually become "sentient" and take over by getting rid of human beings (the classic arcade game Robotron: 2084 is based on this theme). They got it wrong, of course. It's not the computers that will do bad things to us, but the bad people who use them.
---------------------------------
I'm digressing a little here...
But yeah, the article was way off in its predictions; optical computing still has not replaced conventional silicon logic. Back in the 1990s, "virtual reality" was all the rage, and the helmet looked cool. But what is cool isn't necessarily practical. Is it cool to have a computer that is controlled entirely by voice and hand-waving, without a keyboard or mouse? Sure, but it becomes very impractical after 5 minutes, once the "coolness" factor wears off.
So we are still within the paradigm of the transistor-based CPU, and it is being maxed out: all of the easy low-hanging fruit in terms of removing inefficiencies has been picked, clock speeds can't keep climbing to sustain Moore's-law expectations anymore, and the manufacturers have reached a plateau. Instead of throwing greater GHz requirements at software (which is what Microsoft did with their bloatware called Vista), the software companies will have to learn to make their programs "leaner and meaner" (like Microsoft is attempting to do with Windows 7).
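(To put rough numbers on it: Moore's law as usually quoted has transistor counts doubling about every two years, i.e. N(t) ≈ N(0) × 2^(t/2) for t in years, so roughly a 32x increase over a decade. Strictly speaking it is about transistor count, not clock speed, which is exactly why the extra transistor budget now goes into more cores and bigger caches rather than more GHz.)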
Post by retrobits on Jul 9, 2009 14:50:17 GMT -5
The semiconductor industry has been very creative in keeping up with Moore's law, although they are starting to run out of steam. And I think that speed increases in and of themselves are no longer directly helpful. I don't need the hardware to get faster, really; I need the software to stop bloating at 1.5x the rate at which the hardware gets faster.
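(Quick arithmetic on that: if the hardware gets 2x faster but the software's demands grow at 1.5 times that rate, i.e. 3x, then the machine you actually sit in front of feels only two-thirds as fast as the old one did.)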
Netbooks are an example -- these awesome little devices run on pretty spartan CPUs. They do what they do, and they do it well. I can't wait to get one for myself.
Windows 7 seems like a step in the right direction. I've been using the RC for a while now, and it's good stuff. However, it's still heavy. From the support and development side of things, whenever I work on modern-day OSes (Linux and OS X included, sigh), I feel like there's a ton of weight hanging over me, ready to fall at any moment. It's just too unwieldy.
I don't know what I'd predict (and I've thought a lot about it). But I know what I hope for...a computing platform that is modular, loosely coupled but with tight cohesion, solid as a rock, and efficient to use and develop for. I've surveyed the landscape, and nothing yet meets those lofty goals. In fact, it seems to be getting worse before it gets better...
Post by thurstan on Jul 9, 2009 16:57:40 GMT -5
Sooner or later hardware performance is going to plateau (with the current technology), and developers are going to be forced to code better and more efficiently -- exactly as they did back in the 8-bit/16-bit days. You had your fixed hardware platform and that was it, so developers/coders/programmers really learnt the hardware inside out and got fantastic results out of limited machines. This plateau will mark the point at which current tech becomes retro!!
Microsoft are slowly realising that bloatware is an issue; look at Windows Server 2008 Core and Windows 7 as examples.
Google has announced their OS, and that is a good indicator of things to come: a clean, efficient OS designed for the internet.
Now we need a hardware company to come up with a new computer architecture, to reboot the computer industry again!