Parents seem more concerned by the array of technology-oriented undergraduate education options these days than their kids. It's presumably still true that high school seniors feel instantly overwhelmed when their prospects for higher education are discussed, so adding in the complexity of choosing the appropriate sub-discipline in order to properly predict the fast-moving tech job market doesn't really induce any more panic in them than the word "college" does, all by itself. It's parents, then, who worry about choosing the appropriate specialization, picking the best geographical location for said specialization, and guessing about whether or not there'll be demand in four years for a graduate from a school that only teaches programming courses using Java.
Clearly, there are problems with treating technology-oriented higher education like more traditional disciplines, and it's obvious to posit that a higher-tech solution could improve things. That's nothing new -- a lot of people have been saying that educational institutions could take better advantage of a lot of new technologies in order to deliver a better education to more people, but usually the specifics have been either purposely avoided or killed in funding subcommittee. I was talking about this last week with a friend who has two kids, one about to stare down the barrel of the SATs, and we decided to come up with a few specifics. Here, then, are some ideas regarding how to better tech-oriented undergraduate education, and maybe even improve efficiency in the process.
The single most important thing that needs to change about tech education is the belief that specialization is a good and desirable educational goal. US News & World Report and The Princeton Review have it dead wrong; it's useless to list top schools in 40 niche categories, since most kids have no idea what they want to do with their lives. (And even if they do, they're likely restricted geographically by either funds or transportation costs, so the list's utility is at best handicapped.) Instead, it makes a whole lot more sense to rank schools by breadth of educational opportunity; that is, how many different areas of study are available, and how well do closely correlated fields rank in the aggregate? As an example, CMU is known for its computer science, fine arts, and drama programs. MIT is known for its technical programs. Chances are, a kid who wants to have some sort of computer-related job is going to rank MIT first, but if that ends up not working out, he'd probably be better off at CMU where there are other options available. Since the major problem these days for schools is actually graduating students, it makes sense to encourage selection of institutions based largely upon opportunities offered. It simply increases the likelihood of success.
I suspect that ranking schools based on breadth of opportunity will give a big boost to state schools, which is great because state schools have perhaps the best chances of making great strides through new technology. Playing off of their traditional strengths as both more affordable and more local choices, state schools could take advantage of their public sponsorship in order to make themselves even more available to the general population by making clever use of both the internet and another, oft-overlooked, undervalued public resource: libraries.
If we assume that the major education-related hurdles for the average undergrad are money, time, proximity, and availability of supplemental resources, then the local library becomes a blindingly obvious place to establish a virtual satellite campus. Library computers would be an option for people who don't have access to a computer otherwise, and access to course material -- both printed and electronic -- would be immediate, reducing barriers to getting work done. Also, libraries tend to be proximal to large numbers of people, and by making them the congregation point for students, group discussions and even in-person teaching assistant help sessions would be feasible. Tuition might be even lower with no physical infrastructure to maintain, and besides, increased enrollment numbers may well be able to make up for lowered tuition. And either way, the biggest win here would be that the campus becomes part of the students' existing community; one could argue that local babysitting services and other small businesses might get a boost from a program like this, because traditional college-town infrastructure would appear all over the state instead of concentrating in a few areas.
The final piece of the education overhaul follows logically from the virtual-campus idea: separate areas of the state could have different teaching philosophies and educational styles, even as they teach the school-mandated online course curriculum. This would effectively address the final issue technology-oriented undergrads face: how to predict the ever-shifting job market. Students would be able to tailor their education to the local job market, where trends are slower to change than they are nationally. Also, if there's a large local employer looking for specific skills, being able to work on one's education while interning for that employer becomes possible where it was not before. Extending that thought, I think a library-based virtual campus approach like this could possibly increase local employment rates, making it a win for local government as well.
This solution is of course highly speculative, and based on a number of assumptions. But it does address a good number of problems facing potential undergrad (and continuing education) students today, and it does so without introducing any new public services or utilities. It simply makes more effective use of existing ones, and does so while encouraging state-school attendance and bringing small businesses, white-collar employers, and communities together.
autocratic for the people.
Monday, November 21, 2005
Tuesday, November 15, 2005
In the comments for the previous post, Jeff responded:
Your main point towards the end of the article is that the Revolution will receive ports of 360 (and you didn't mention it, but probably PS3 games as well, though not at launch). But you also talk about how one of the Revolution's greatest attractions is the low price tag they achieved by "foregoing high horsepower processing." I am highly skeptical as to whether or not the Revolution will be able to handle games made for the 360 (or PS3). I think that them going with the low-cost approach will allow them to hit a wider audience, but at the cost of getting very few direct ports of other games. Any games for the Revolution will most likely be Rev-exclusive (though they could be ported to 360/PS3 and upgraded), and most games for 360/PS3 won't make it to Revolution. I imagine companies could modify the game to run on the Rev, but who would want to buy a game that's toned down when it's available on another console with all the extra stuff?
In general, my philosophy here is to do right by both technical and non-technical readers, but there's no easy path to victory here. Welcome to software engineering hell.
The main reason I believe ports between the Revolution and 360 will be plentiful is that the systems are similar from a programming standpoint, and usually it's the difficulty of adapting existing game code that makes or breaks game ports. Art, sound, and other assets are generally easy to transfer from one platform to another as compared to game code, and for the next-gen consoles this looks to be especially true. We need to consider the Revolution and 360 from a software guy's point of view.
The 360 features an ATI GPU, 512MB RAM, and (theoretically) six simultaneously executing threads. The Revolution is believed to feature an ATI GPU, some nontrivial amount of RAM, and two to four simultaneously executing threads. Now, game programming has traditionally been a single-threaded affair, and almost all consoles -- save the Saturn and PS2 -- have primarily afforded this traditional, single-threaded approach. The crazy new world of multithreaded consoles will not, I believe, usher in radically altered programming styles and game engines in the short run (this sentiment is not unique to me; in fact, Microsoft has already admitted that every 360 launch title is single-threaded), and even in the long run I find it pretty unlikely that many games will ever find a way to take advantage of more than three or four simultaneously executing threads. Game code simply doesn't break into separate, unrelated segments particularly well. (It's true.)
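To make that threading point concrete, here's a toy sketch in Python -- not real engine code; the stage names and per-entity model are invented for illustration. Per-entity physics can fan out across threads, but the AI stage has to wait for the join, and rendering would wait on the AI, so the frame has a serial tail no matter how many cores you throw at it.

```python
import threading

# Toy sketch of a single game frame. The only safely parallel work in
# this model is per-entity physics integration; AI reads the physics
# results, so it must run after the join.
def run_frame(state):
    def integrate(i):
        # Independent per-entity work -- parallelizable.
        state["positions"][i] += state["velocities"][i]

    threads = [threading.Thread(target=integrate, args=(i,))
               for i in range(len(state["positions"]))]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    # Serial tail: AI depends on the new positions (and rendering,
    # not shown, would depend on the AI's output).
    state["ai_targets"] = [p * 2 for p in state["positions"]]
    return state

state = {"positions": [0.0, 1.0], "velocities": [1.0, 1.0]}
run_frame(state)
print(state["positions"])   # [1.0, 2.0]
print(state["ai_targets"])  # [2.0, 4.0]
```

However many threads the hardware offers, the dependency chain caps how many of them a frame like this can actually keep busy.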
So, from an engine architecture standpoint it seems unlikely that a game designed for the 360 will make use of more simultaneously executing threads than the Revolution will be capable of, and most other aspects of the consoles are similar (the 360's hard drive isn't a standard feature, so it probably won't be relied upon by games). As far as the likely scenario that the Revolution hardware is simply slower or less capable than the 360's, well, Microsoft wants every 360 game to run at HDTV 720p, and the Revolution won't even support HD, so that's a big load off the caching and rendering systems right there.
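The rendering-load claim is simple pixel arithmetic. The resolutions below are the standard ones; the comparison itself is back-of-envelope:

```python
# Pixels per frame at Microsoft's mandated 720p versus a standard-
# definition target (roughly what a non-HD Revolution would render).
hd_720p = 1280 * 720   # 360 titles must hit this
sd_480  = 640 * 480    # typical SD resolution

print(hd_720p)           # 921600
print(sd_480)            # 307200
print(hd_720p / sd_480)  # 3.0 -- the Revolution pushes ~1/3 the pixels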
That's the slightly expanded version of why I think a lot of 360 games will hit the Revolution as well. The hardware's similar enough that major portions of the game code should be reusable, and I think that's the primary hurdle when porting a title. As far as the PS3 goes, from a hardware standpoint it's a completely different animal, and early reports are that it's truly a bitch to work with. This time around, it may be Sony who ends up having to carry their own library with first-party titles.
The first XBox 360 reviews are out (1, 2), and shame on you if you can't predict the final scores. It's hardly surprising: launch titles, in particular first-party ones, trend towards crap with astonishing precision. They're warmed-over examples of popular genres (or, more lazily, simply warmed-over editions of old games) that hit it big on the last-generation systems, and they forego improved or innovative gameplay in favor of updated graphics (or, more lazily, increased resolution). It's definitely fair to say that first-party launch titles are crap, all the time.
... Well, not all the time, but certainly for the established players. Sony and Sega have proven themselves incapable of resisting the ol' sausage grinder, cranking out 80% lean arcade beat-em-ups, fighting games, and third-person shooters with their last two systems. SCEA launched the PlayStation with first-party gems Toshinden (fighting), Kileak (first-person shooter), and Raiden (side-scroller), and alongside the PS2 they birthed FantaVision (puzzle), and .. actually, that was the only one. Sega, for its part, presented new Saturn owners with Clockwork Knight (2d platformer), Daytona USA (racing), Virtua Fighter (fighting), and the admittedly excellent Panzer Dragoon. The Dreamcast wrought NFL2k (football), Sonic Adventure (3d platformer), and House of the Dead 2 (lightgun! .. but with no lightgun). And we won't even start on the XBox.
You may see where this is going. In contrast to other manufacturers, I claim that Nintendo generally produces excellent, blockbuster, system-selling titles to coincide with their system launches, and seems overall to more aggressively pursue innovation in its gameplay and player experiences. The Nintendo64 launched with three first-party games: Wave Race 64 (a racer, but one which truly could never have existed before), PilotWings 64 (ok, it sucked), and Super Mario 64, a game many consider to still be one of the best ever made. The Gamecube launch, though uncharacteristically weak, nevertheless produced first-party titles Luigi's Mansion (eh), Wave Race Blue Storm (eh), and Animal Crossing -- again, an excellent game which simply could not have existed on any prior console.
I harp on launch titles because I believe they provide fair insight into a company's philosophy. Reading down the list of acclaimed games for each system, there seems to be a good correlation between launch titles' quality and quantity, and overall library quality and quantity. The XBox, PlayStation and PS2 provide a veritable sea of unremarkable titles, with a few gems. The Saturn and Dreamcast provided more daring, intelligent games, but at the cost of a fairly small library. And the N64 and Gamecube have very small libraries of generally high quality games, usually with three or four titles that are considered works of art.
Looking at these libraries more closely, it's easy to see that cross-platform titles (ports) comprise a large percentage of the games on the Sony, Microsoft and Sega systems, but not on the Nintendo ones. Historically this was because Nintendo either carefully controlled publishing on their systems (NES, Super NES), because their hardware was radically different (N64, Gamecube), or because the comparatively low number of people who owned their systems made it not cost-effective to perform the port (N64, Gamecube). As a result, Nintendo is accustomed to, and comfortable with, sustaining their consoles using only their own games.
This approach has led to Nintendo's becoming, I believe, the undisputed king of innovative gameplay experiences. They have a history of launching weird and wonderful systems, and of being one of the very few developers to put out titles for those weirdo systems. This willingness to lay it all on the line in order to try new things is very exciting and attractive to me, which is why, over the last few years, I've owned an N64 (not a PlayStation or Saturn) and a Gamecube (not a Dreamcast, PS2 or XBox). I look for games to provide novel experiences, and though Gran Turismo 4 might be an addictive, fun game, it's nothing new. But plastic bongos? Now that's new.
I'm very excited about this next generation of consoles, because I feel like Nintendo suddenly has a shot at blowing everyone's doors off. How so? I think they'll do it by making the most compelling case in each of the three key areas of contention in the console wars: overall size of the game library, price of the console, and quality of exclusive titles.
There's a good reason the Nintendo systems have been in third place over the last two console generations: people like choice, and they gravitate towards the system with the most games. In the past, as was mentioned above, Nintendo's consoles have had weird hardware. This results in fewer game ports, which results in fewer console sales, which results in fewer game ports. Fortunately, they've realized their error, and have made the proper course correction. This time around, the hardware's normal. Their next console, the Revolution, is powered by the same PowerPC-derived architecture as the XBox 360, and it facilitates multithreaded applications in a similar manner to both the 360 and the PlayStation 3. This is highly oversimplified, but the conclusion is valid: The next Nintendo system will have a low cost of entry for developers, and will very likely receive a whole lot of XBox 360 ports as a result. This is great for the average consumer, because the 360 is launching real soon now, and developers will be working on their second- or third-gen 360 titles by the time the Revolution rolls around. Since simultaneous release of games is a popular way to stretch marketing dollars, the Revolution will very likely be seeing a number of second-gen 360 games ported to it just as it launches. A large perceived library at launch is great, and goes hand-in-hand with the next reason the Revolution is going to succeed: price.
Although the official price of the Revolution hasn't been revealed, it's been made clear that it will undercut the 360 and PS3. This is in line with the Gamecube strategy, and Nintendo's desire to sell to a generally younger audience than the other systems. But this time around, if we assume that the Revolution and 360 will have similar libraries, then the cheaper system is clearly the logical choice for parents come Christmastime. (And, if current refurbished system prices are any indication, used 360s will still probably cost more than new Revolutions). Nintendo will be able to undercut the price of their competition by foregoing high horsepower processing and a lot of the extraneous media center stuff in order to deliver "a single-minded gaming device."
And finally, with their haste to address price and library size, it would make sense to assume that Nintendo has chosen to take a more mainstream route with the Revolution -- just making a 360 on the cheap. But again, they've chosen the path less traveled, and made the move no one else had the balls to: they've completely reinvented the controller, and in so doing they've guaranteed that their excellent first-party titles will not be replicable on any other system. (They've said that a normal controller will be made available as well, but it won't be the default.) So they'll reap the rewards of titles developed for other systems and ported to theirs, while ensuring the opposite won't work. It seems an excellent strategic move.
If you didn't watch the controller movie linked to above, please do. It's worth it.
In the end, Sony and Microsoft will still be hugely successful, and their future consoles will very likely enjoy as much success as the current ones have. But I think this time around Nintendo has managed to outsmart them, and I predict they'll have the system to own.
Wednesday, November 09, 2005
John Siracusa makes an excellent argument for incorporating a second, backup hard drive into home desktop computers, in order to make the inevitable case of drive failure less catastrophic. His argument is Apple-specific, because that's his thing, but it's a pretty good idea across the board. Home computer users are now definitely encouraged to store their entire lives -- music, checkbooks, photo albums, email, porn, tax returns -- in one place, so it's inexcusable to avoid addressing failure of the fragile magnetic drive at the core of it all.
Now, the proposed iMac onboard-backup-drive idea is certainly a great one, and it does fit in well with Apple's position as a purveyor of end-to-end premium experiences (with equally premium pricing). But the backup idea is also dangerous, for the same reason any last line of defense is dangerous: it has to be perfect. In the case of a data backup device, let's define "perfect" as:
1. The saved data has to be up-to-date.
2. The saved data has to be usable.
To the first point, home users shouldn't have to think about backing up their data. It's a hardware issue, it's not their responsibility, and the onboard-backup approach handles this requirement very well if it's set up RAID-1 style (that is, both hard drives are always kept exactly the same -- data saved on one is simultaneously saved on the other). The user need not do anything at all to have an always up-to-date backup, and that's perfect. The requirement is met.
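A minimal sketch of that mirroring idea, using directories to stand in for drives -- the names and paths here are illustrative, not any real Apple API:

```python
import os
import tempfile

# RAID-1 in miniature: every write goes to both "drives", so either
# copy alone is a complete, up-to-date backup.
def mirrored_write(drives, filename, data):
    for root in drives:
        with open(os.path.join(root, filename), "wb") as f:
            f.write(data)

drive_a, drive_b = tempfile.mkdtemp(), tempfile.mkdtemp()
mirrored_write([drive_a, drive_b], "checkbook.dat", b"important bits")

# If drive_a dies, drive_b alone still has everything.
with open(os.path.join(drive_b, "checkbook.dat"), "rb") as f:
    recovered = f.read()
print(recovered)  # b'important bits'
```

The user-facing point is in the function signature: the caller writes once, and the mirroring is invisible.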
The problem with the onboard-backup approach is revealed with the second, more subtle requirement: the saved data has to be usable. This makes sense; clearly, a backup is useless if the data can't be retrieved, right? But let's rephrase it: the backup drive needs to be even less likely to fail than the primary drive. So we need to ensure that whatever caused the primary drive to fail, it isn't also going to affect the backup drive. This is paramount.
Hard drive failure is most commonly due to mechanical wear of the drive components over time (it's true), which is a problem here because mechanical wear results from usage, and the two drives would be experiencing similar levels of usage if they were mirroring one another. Additionally, we have to assume that in a home environment failure-prediction technologies are not particularly useful because they're too easily ignored -- and besides, many home computers are multi-user so there's no guarantee that every user with data on the drive will know that there's a problem. The kids might not tell mom and dad that the backup is toast until it's too late.
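Some back-of-envelope numbers show why correlated wear matters. The 5% annual failure rate and the "shared wear" factor below are both assumed purely for illustration:

```python
# If two mirrored drives failed independently, losing both in the same
# year would be rare. But shared usage correlates their failures.
p = 0.05  # assumed chance one drive fails in a given year

# c is a made-up shared-wear factor: 0 = fully independent failures,
# 1 = the mirror always dies alongside the primary.
def p_both_fail(p, c):
    return p * (c + (1 - c) * p)

independent = p_both_fail(p, 0.0)  # just p * p
correlated  = p_both_fail(p, 0.5)

print(round(independent, 5))  # 0.0025  -- both die in 0.25% of years
print(round(correlated, 5))   # 0.02625 -- ~10x worse with shared wear
```

Under even moderate correlation, the "backup must be less likely to fail than the primary" requirement gets much harder to meet.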
Laying aside the issue of mechanical failure, let's consider other likely reasons for in-home hard drive suicide: electrical problems (lightning or blackouts), physical problems (dropping, hitting or getting the computer wet), or software problems (viruses). For this backup to really be "perfect" and part of a truly premium experience, it must be able to restore data even in the face of these threats. And it must do so without sacrificing any of the requirements we've already enumerated. This is a difficult problem.
Fortunately, I think I not only have a solution, but I have one that fits nicely into the current Apple Airport hardware family. It would be a standalone unit combining a hard drive and wireless router, it would replace the current Airport Extreme, and it would look a lot like the Airport Express. The idea is to combine a home's internet access point (the router) with the backup device, and I believe it would actually solve all of the problems mentioned above.
By putting the backup device into the home's internet access point, the backup device becomes accessible to anyone on the home network. This opens the door to a number of possibilities, most notably that the access point can immediately and continuously warn anyone and everyone connected that there's a problem with the backup. This is sufficient because any machine that backs its data up to the access point must connect to it every so often in order to perform the backup, so anyone affected by the backup's failure -- even people who aren't connected constantly -- is made aware of the problem pretty quickly. Additionally, wary parents can instruct the access point to restrict or lock down internet access if the backup goes south. This would pretty much guarantee an incentive to make the local administrator aware of the problem.
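That warning policy might look something like the following sketch -- the class and method names are invented, not any real firmware interface:

```python
# Sketch of the access point's warn-on-connect and lockdown behavior.
class AccessPoint:
    def __init__(self, lockdown_on_failure=False):
        self.backup_healthy = True
        self.lockdown_on_failure = lockdown_on_failure
        self.clients = []

    def connect(self, client):
        self.clients.append(client)
        # Even infrequent connectors see the warning on their next visit.
        if not self.backup_healthy:
            client.notify("Backup drive has failed; your data is at risk.")

    def report_backup_failure(self):
        self.backup_healthy = False
        for c in self.clients:
            c.notify("Backup drive has failed; your data is at risk.")

    def internet_allowed(self):
        # Wary parents can tie internet access to backup health.
        return self.backup_healthy or not self.lockdown_on_failure

class Client:
    def __init__(self):
        self.messages = []
    def notify(self, msg):
        self.messages.append(msg)

ap = AccessPoint(lockdown_on_failure=True)
mom, kid = Client(), Client()
ap.connect(mom)
ap.report_backup_failure()
ap.connect(kid)               # connects after the failure, still warned
print(ap.internet_allowed())  # False -- strong incentive to tell dad
print(len(kid.messages))      # 1
```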
As far as invisibility goes, the backup would occur wirelessly every night (or as a background process during the day), and either way it would go unnoticed by the user. Since the backup data is only transferred between the single computer and the access point, upstream bandwidth is unaffected. Internal network bandwidth is affected to some extent, but chances are the major bottleneck is related to the upstream connection anyway.
This periodic-backup scheme also positively affects reliability -- the backup hard drive is not in constant use as is the primary drive, so its operating life should be significantly longer. In a similar vein, separating the backup drive from the computer decreases the likelihood that accidents involving the power supply or physical machine will affect the backup, and placing the backup in a separate, locked-down device should make it more resilient, in general, to malicious software.
And finally, by making the backup device a separate piece of hardware, consumers need not upgrade their entire computers -- probably inadvertently losing data in the process! -- in order to enjoy the benefits of a reliable backup. Marketing would love it.
This theoretical wireless router / backup device can be made using existing parts. The hard drive would be a low-speed, low-heat, low-power laptop drive -- possibly the 120GB 5400 RPM drive from the current Powerbooks. The rest of the hardware will be straight from the current Airport line.
This device would replace the Airport Extreme, and would retain its price of $199. The marketing would be, "The Airport. Now with secure data backup. Sleep well."
Now, the proposed iMac onboard-backup-drive idea is certainly a great one, and it does fit in well with Apple's position as a purveyor of end-to-end premium experiences (with equally premium pricing). But the backup idea is also dangerous, for the same reason any last line of defense is dangerous: it has to be perfect. In the case of a data backup device, let's define "perfect" as:
1. The saved data has to be up-to-date.
2. The saved data has to be usable.
To the first point, home users shouldn't have to think about backing up their data. It's a hardware issue, it's not their responsibility, and the onboard-backup approach handles this requirement very well if it's set up RAID-1 style (that is, both hard drives are always kept exactly the same -- data saved on one is simultaneously saved on the other). The user need not do anything at all to have an always up-to-date backup, and that's perfect. The requirement is met.
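To make the mirroring idea concrete, here's a toy Python sketch of RAID-1-style writes -- plain files standing in for drives, and a MirroredDisk class that's entirely my own invention. A real RAID controller works nothing like this, but the principle is the same: every write lands on both drives, so either copy alone is a complete backup.

```python
# Toy sketch of RAID-1-style mirroring: every write goes to both
# "drives" (here, plain files), so either copy alone holds all the
# data. Note that both copies also accumulate identical wear -- which
# is exactly the problem discussed below.

class MirroredDisk:
    def __init__(self, primary_path, mirror_path):
        self.paths = [primary_path, mirror_path]

    def write(self, data):
        # The same bytes land on both drives at essentially the
        # same time; the user does nothing extra.
        for path in self.paths:
            with open(path, "wb") as f:
                f.write(data)

    def read(self):
        # A read can be served from either copy; fall back to the
        # mirror if the primary is gone.
        for path in self.paths:
            try:
                with open(path, "rb") as f:
                    return f.read()
            except OSError:
                continue
        raise IOError("both drives failed")
```

Lose the primary, and `read()` quietly serves the mirror -- that's the "user need not do anything" part.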
The problem with the onboard-backup approach is revealed with the second, more subtle requirement: the saved data has to be usable. This makes sense; clearly, a backup is useless if the data can't be retrieved, right? But let's rephrase it: the backup drive needs to be even less likely to fail than the primary drive. So we need to ensure that whatever caused the primary drive to fail, it isn't also going to affect the backup drive. This is paramount.
Hard drive failure is most commonly due to mechanical wear of the drive components over time (it's true), which is a problem here because mechanical wear results from usage, and the two drives would be experiencing similar levels of usage if they were mirroring one another. Additionally, we have to assume that in a home environment failure-prediction technologies are not particularly useful because they're too easily ignored -- and besides, many home computers are multi-user so there's no guarantee that every user with data on the drive will know that there's a problem. The kids might not tell mom and dad that the backup is toast until it's too late.
Laying aside the issue of mechanical failure, let's consider other likely reasons for in-home hard drive suicide: electrical problems (lightning or blackouts), physical problems (dropping, hitting, or getting the computer wet), or software problems (viruses). For this backup to really be "perfect" and part of a truly premium experience, it must be able to restore data even in the face of these threats. And it must do so without sacrificing any of the requirements we've already enumerated. This is a difficult problem.
Fortunately, I think I not only have a solution, but I have one that fits nicely into the current Apple Airport hardware family. It would be a standalone unit combining a hard drive and wireless router, it would replace the current Airport Extreme, and it would look a lot like the Airport Express. The idea is to combine a home's internet access point (the router) with the backup device, and I believe it would actually solve all of the problems mentioned above.
By putting the backup device into the home's internet access point, the backup device becomes accessible to anyone on the home network. This opens the door to a number of possibilities, most notably that the access point can immediately and continuously warn everyone connected that there's a problem with the backup. This is sufficient because any machine that backs its data up to the access point must connect to it every so often in order to perform the backup, so anyone affected by the backup's failure -- even someone who isn't connected constantly -- is made aware of the problem pretty quickly. Additionally, wary parents can instruct the access point to restrict or lock down internet access if the backup goes south. That would pretty much guarantee an incentive to make the local administrator aware of the problem.
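The warn-and-restrict policy could be as simple as the following sketch. The function name and flags here are hypothetical illustrations of the idea, not anything Apple ships:

```python
# Sketch of the access point's warn-and-restrict policy: every client
# that connects gets a warning while the backup drive is unhealthy,
# and (if the parents enabled lockdown) general internet access is
# cut off until someone tells the local administrator.

def handle_client_connect(client, backup_healthy, lockdown_enabled):
    notices = []
    allow_internet = True
    if not backup_healthy:
        notices.append("WARNING: backup drive failure -- data at risk")
        if lockdown_enabled:
            # No internet for the kids until mom and dad hear about it.
            allow_internet = False
    return notices, allow_internet
```

The lockdown flag is the interesting design choice: it converts a silent failure into an immediate, household-wide inconvenience, which is exactly what gets problems reported.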
As far as invisibility goes, the backup would occur wirelessly every night (or as a background process during the day), and either way it would go unnoticed by the user. Since the backup data is only transferred between the single computer and the access point, upstream bandwidth is unaffected. Internal network bandwidth is affected to some extent, but chances are the major bottleneck is related to the upstream connection anyway.
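The "invisible" part of the nightly schedule boils down to computing the next quiet-hours run, something like this illustrative sketch (the 2 a.m. default is my assumption, not a spec):

```python
import datetime

# Sketch of the nightly backup schedule: given the current time,
# compute the next quiet-hours backup window. If today's window has
# already passed, the backup waits until tomorrow night.

def next_backup_time(now, hour=2):
    candidate = now.replace(hour=hour, minute=0, second=0, microsecond=0)
    if candidate <= now:
        candidate += datetime.timedelta(days=1)
    return candidate
```

Since the whole transfer happens between one computer and the access point while the house sleeps, nobody notices it and the upstream connection never feels it.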
This periodic-backup scheme also positively affects reliability -- the backup hard drive is not in constant use as is the primary drive, so its operating life should be significantly longer. In a similar vein, separating the backup drive from the computer decreases the likelihood that accidents involving the power supply or physical machine will affect the backup, and placing the backup in a separate, locked-down device should make it more resilient, in general, to malicious software.
And finally, by making the backup device a separate piece of hardware, consumers need not upgrade their entire computers -- probably inadvertently losing data in the process! -- in order to enjoy the benefits of a reliable backup. Marketing would love it.
This theoretical wireless router / backup device could be made using existing parts. The hard drive would be a low-speed, low-heat, low-power laptop drive -- possibly the 120GB 5400 RPM drive from the current Powerbooks. The rest of the hardware would come straight from the current Airport line.
This device would replace the Airport Extreme, and would retain its price of $199. The marketing would be, "The Airport. Now with secure data backup. Sleep well."
Sunday, November 06, 2005
There seems to be an increase in the number of "challenging" car designs lately. Perhaps born of renewed consumer interest in aesthetics and design, more likely a side effect of outsourcing-bred commoditization across various auto parts supplier markets, it's come to be recognized that polarizing designs are an excellent way to promote both brand image and conversation in general (which is great, since word-of-mouth advertising is the marketing orgasm du jour). And that's nice, to a point, because if consumers are paying attention to design, then brands can justify hiring design and humanities people to lead their desperate attempts to stand out in the shockingly incestuous automotive business.
Unfortunately, because the motives behind the corporate interest in aesthetics are less than pure, the buying public is forced to endure designs intended not to provoke positive reactions, necessarily, but rather to provoke strong reactions period. BMW in particular has certainly received a lot of press regarding their questionable styling direction, and there's no particular reason to believe that they really think they're making beautiful cars. Perhaps they're instead making expensive conversation pieces. Even if that isn't explicitly the plan, we can be sure that it's at least crossed their minds.
If we accept that out-there industrial design is a good way to get consumers talking about "the brand" (e.g., Ford) in abstract terms, as opposed to simply "the car" (the Ford Explorer) in concrete terms, then the Crazy-Ass Automotive Styling Explosion of 2001-5 makes a fair bit of sense. And indeed, most higher-margin, premium brands are attempting to tiptoe away from their more mainstream, risk-averse corporate overlords by piling on the jewelry and pouting for the camera. A strong brand image can clearly drive sales even in the face of severe product issues, so it makes sense to put lipstick on that pig if it raises awareness of the overall brand.
The problem with taking a brand's styling in a new direction is that it tends to piss off the, you know, existing customers. Styling is a double-edged sword like that: as much as it communicates fresh new values and a revised, hip sensibility, it also signals that the old message is on its way out. And that's a problem, because repeat buyers are valuable, and bad things can happen when they get unhappy.
So far, the solution to this quandary seems to be to retain a few beloved "heritage" styling cues across different brand reinventions. An example of one that works is the BMW twin-kidney grille (1933, 2006); an example of one that doesn't is the Nissan truck grille (1990, 2006).
Heritage styling cues are particularly important to the tuner arms of the German manufacturers, because they give fairly subtle body treatments to their cars, and the cues tend to be unique and critical to distinguishing the hopped-up cars from the normal ones. In the mid-90s AMG adopted an outward-splitting front valance and five-spoke wheels, and BMW M made the quad exhaust and hood vents its trademarks. These cues grew in prestige as the brands did, and became iconic symbols in their own right.
So, as you'd expect, when the Crazy-Ass Styling Explosion of 2001-5 took off, a popular maneuver was to adopt these prestige styling cues as one's own, often with ridiculous results. The most egregious single example, I believe, is Pontiac's, with runners-up Nissan and, inexplicably, Audi.
The situation we're amusingly left with now is that most manufacturers have just finished revising their model lines to fit with their outrageous, hey-talk-about-me new looks, only to find that their traditional styling cues, the ones with true heritage that resonate with existing customers, have been ripped off by the very brands they've been trying to get away from. They've ended up both throwing out their loyal following and losing the uniqueness of their brand heritage -- the very two things they were attempting not to do.
Thursday, November 03, 2005
Quake II was a big, big game for me. I bought it in high school, at the same time as two friends and fellow GLQuakers, and we quickly found colored lighting and Threewave CTF to be excellent additions to the first-person-shooter genre. Many nights were spent wandering the hallways of far-off planets, seeking out enemy flag carriers and gunning them down. Quake II multiplayer was not ahead of its time, but it was a fine example of carefully selecting common gameplay mechanics and polishing them until they fucking shined.
As a single-player game, Quake II was unfortunately not so hot. For the first time in an id Software game, the plot was not only a paragraph in the manual, but sort of integrated into the game. It was something about a war in space, or something -- the specifics were hazy. You played as a space marine, sent to another planet in order to destroy the enemy boss. But oh no! An accident occurs during your insertion, and you end up separated from your squad on a hostile alien world, alone but for your trusty sidearm (conveniently mapped to the "1" key; it's called foreshadowing).
The plot quickly degenerated into shoot-guy-flip-switch, as you might imagine, and the game went downhill fast. The aforementioned colored lighting, although cool and useful as a navigational aid, was not enough to elevate the pedestrian gameplay. id seemed not to realize that the priorities of an excellent multiplayer game are not the same as those of a single-player game, and in fact they often work at cross-purposes. Carefully balanced weapons are great online, but when you've slogged through three levels worth of space-jail and are rewarded with a new gun, it'd be nice to get a jaw-dropping moment the first time it's fired.
The single-player vs. multiplayer disparity was eliminated in the next Quake game, Quake III Arena. Quake III was a multiplayer game only, in the sense that the single-player game was still deathmatch, but against computer-controlled opponents. The company line was that this title bypassed "plot" and "immersion" in order to deliver the best goddamned deathmatch possible. Regardless, it had no story and did not attempt to further the Quake II plotlines.
Which brings us to the next game after Quake III, Quake 4. (Note, not Quake IV.) Powered by id Software's own Doom 3 (not Doom III) engine but developed by perennial id licensee Raven Software, this newest Quake title aims to blend the single-player focus of Doom 3 with the multiplayer focus of Quake III. Impossible? Indeed.
The single-player game in Quake 4 reprises Quake II as much as it continues it. The player looks through the eyes of Matthew Kane, an apparently mute space marine sent to the alien world of Stroggos and ordered to destroy the enemy boss. But oh no, an accident occurs during insertion, and he ends up separated from his squad, alone but for his trusty sidearm. From there, the game progresses in disappointingly similar fashion to every other first-person shooter ever made -- receive orders to turn something either on or off, shoot guys and press buttons until said thing is turned on or off, retrace through half the level, and receive new orders. The missions are repetitive and uninspired, to the point where some (the crate-stacking robot-arm puzzle, for example) are easily solved simply because most FPS players already solved that puzzle in every other FPS. It becomes a relief to enter a room and simply find some enemies, because shooting stuff is mercifully Quake 4's strong point.
Unlike a fair number of FPS developers, Raven Software (the ghostwriters of Quake 4) understand how to make shooting bad guys fun. They nailed the formula with Soldier of Fortune, a Quake III-derived first-person shooter released in 2000, and haven't strayed from it since. In fact, it would be accurate to describe Quake 4 as a cross between Doom 3 and SoF: the ultra-quick movement speed, solid-feeling weaponry, and liberal use of exposed intestines remind me very much of the latter, while the moody visuals and "just feels good" mouse input clearly indicate the presence of the former. Circle-strafing with a smoking shotgun just feels right in Quake 4, so if that's what you're looking for, this is the game. (Although, since this is the Doom 3 engine, there are rarely more than five enemies onscreen at any one time -- Serious Sam fans take note.)
Speaking of moody visuals, you might notice that the graphics haven't really been covered yet in this review. That's because they're great, just as you'd expect. Haze, sparks, and lighting effects are second to none, as you'd expect. Character models are disturbingly low-poly and nowhere near as compelling as Half-Life 2's, as you'd expect. Expectations are set and met in these areas, but nowhere is the bar raised in any way.
That statement extends to other areas, too. The sound is passable, as it was in Doom 3, with nothing really standing out. Same story with the AI -- shockingly, it still blows, and in the same ways. The levels you will find yourself fighting through include a factory, a prison, a zombie-infested sewer, some towers, and a giant arena at the end, with bosses every so often that require a whole lot of shooting to kill. There are some on-rails vehicle sections, wherein you figure out how to control the vehicle just well enough to wish that the vehicle section would be over. The ending, too, is crap, and exists solely to set up for Quake 5. Quake V. Whatever.
Oh, and about the multiplayer: it consists primarily of maps from Quake II and III. This is true, I swear. They've rehashed past Quake multiplayer, but with the Doom engine. And it's uninspired.
In the end, the miserable failure of Quake 4 is its complacency and unwillingness to improve the series. Quake II and III were great because they introduced one or two new ideas while relentlessly refining the existing ones, making them more and more fun. By comparison, Quake 4 neither introduces any new concepts nor refines the existing ones, leaving the player no better for having experienced it. I guess there's no real reason to innovate when the Quake name guarantees sales, but I wish that Raven would've tried to advance the art a bit since they knew their effort would end up in front of a lot of eyes. The dearth of creativity in FPSes right now is kind of depressing, and I used to look to id to raise the bar even a little bit with each new title. But even I can draw a trend line from Doom 3 to Quake 4, and there's nothing new going on here. Better just to replay Half-Life 2 instead.