Kadin2048's Weblog

Fri, 18 Apr 2008

After reading about game designer Steve Gaynor’s bet about the art of interactive games in half a century, I wrote my thoughts in a MetaFilter comment. In it, I made my own prediction:

In 50 years, I fully expect interactives to be defying comparison to any other art or entertainment form, except maybe hallucinogenic drugs. Of course, there will still be “games” in the way we think of them today, because people like light entertainment and they’re fun. But I think there will also be computer-mediated ‘experientials’ that involve you going into a room and sitting down, and coming out three or four days later wondering how bad the flashbacks are going to be.

I really don’t think it’s that much of a stretch.

Nobody would question today that a filmmaker at the height of his craft can provoke an intense emotional response from his audience; in fact, the ability to do so might be the greatest indication that a filmmaker is worth his salt. But really, a film is just a series of rapidly flashing still pictures accompanied by a pre-recorded soundtrack. It’s not interactive; if anything it’s impersonal: close your eyes or walk out of the room and come back, and it will have moved on without you. It’s just a recording. That we can be so emotionally affected by movies — brought to heart-pounding excitement or to tears — is a testament to our ability to concentrate on and immerse ourselves in artificial worlds through limited sensory input.

Interactive media have the possibility of being so much more than they currently are, which is just barely approaching the narrative depth of film. Most modern games — and yes, I know, there are exceptions; but most major-market ones in the US — trade on only a few emotions: largely fear, excitement, surprise, and anger. It’s an understatement to say that there’s a huge amount of headroom for improvement.

Future games could combine this creative, narrative latitude — the stuff of the very best cinema — with the immersiveness and interactivity of games, and I think the results could be really astounding.

What’s missing, today, is the audience. It’s hard to make a big-budget game that doesn’t fall into a couple of well-defined categories (first-person shooter, maybe god-mode RTS, or open-world third-person explorer) and cater to a market dominated by a young, male demographic. This is largely because older consumers didn’t grow up with video games, or grew up with games that were so primitive — arcade-style “twitch” games — that they don’t take them seriously as anything but momentary entertainment, and thus aren’t willing to spend the money on a platform capable of delivering high-quality interactive entertainment. That’s something that will almost inevitably change in the coming decades, as people who have grown up with narrative games get older and push the boundaries. As long as Moore’s Law holds, the capabilities that a designer can bring to bear to tell a particular story on a given budget will probably only improve over time, as well.

Time to set a reminder to come back in 50 years and see how we all did.

0 Comments, 0 Trackbacks

[/technology] permalink

Just in case anyone thought that mind-boggling ignorance and gross stupidity were restricted to members of the U.S. government and civil service, this story out of Russia, reported by Ars Technica, will disabuse you of the notion. Apparently they want to impose a mandatory registration and licensing regime on all consumer WiFi gear, under penalty of confiscation:

[T]he government agency responsible for regulating mass media, communications, and cultural protection has stated that users will have to register every WiFi-enabled device with the government […] registration could take as long as ten days for standard devices like PDAs and laptops and […] it intends to confiscate devices that are used without registration.

The Ars story references a Russian source, Fontanka, but it’s (unsurprisingly) in Russian.

Although it’s easy to go for the censorship-conspiracy angle, I’m not sure that there’s as much evidence for that interpretation as there is for plain old public-sector incompetence:

The Fontanka.ru article quotes an industry specialist who points out that the government agency behind the policy is run by a former metallurgic engineer who likely has no clue about many of the technical issues overseen by his organization.

It’s almost heartwarming, how much we have in common.

0 Comments, 0 Trackbacks

[/politics] permalink

The Financial Times has a very interesting article on the relationship — or in this case, lack thereof — between population growth and prosperity. It astounds me a little that any of their findings would be surprising to a first-worlder in 2008, but I’ve heard enough people lament the population decline in Japan and Western Europe that this obviously isn’t the case.

There are two important lessons here. One is that we should always look at per capita, rather than overall, production when measuring the success or failure of various economic policies. Any policy that produces a higher overall GDP at the cost of a lower per-capita figure is stupid, since it’s the per-capita figure that’s linked most intimately with standards of living. Lesson two is that policies premised on continuous population growth just aren’t sustainable, and we need to get rid of them (or at least rethink them) before we hit the inflection point and they become untenable. What we need not to do is view the population decline itself as a problem, because it’s not. Taking population growth as a given is the mistake.

Countries with declining populations, or with populations that may begin to decline soon, have a unique opportunity to consolidate standards-of-living gains and create new social structures that aren’t predicated on pumping out offspring (and consuming non-renewable resources) by the bushel-basket. This is nothing but good for people living in those areas, provided the transition is managed thoughtfully.

0 Comments, 0 Trackbacks

[/politics] permalink

Wed, 02 Apr 2008

COBOL: (Synonymous with ‘evil’.) A weak, verbose, and flabby language used by card wallopers to do boring mindless things on dinosaur mainframes.

[from the Jargon File]

Given many C and LISP hackers’ opinions of COBOL, it’s perhaps unsurprising that it’s one of the least-mature languages on Linux. While C has a compiler (gcc) that rivals some of the best commercial implementations, I’ve had nothing but frustration so far in trying to get a working COBOL compiler running.

There are, as far as I can tell, two COBOL compilers that seem like they might be useful for basic testing and development work: TinyCobol and OpenCobol. TinyCobol compiles COBOL to GNU assembly, which is then translated into machine code by the GNU Assembler; OpenCobol translates COBOL to C, which is then compiled by gcc.
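For what it’s worth, the sort of minimal program I was hoping to feed either compiler as a smoke test might look like this (a sketch in traditional fixed-format COBOL, where column positions are significant — comments are flagged by an asterisk in column 7, and code begins in column 8):

```cobol
      * hello.cob -- minimal smoke test for a COBOL compiler.
      * Fixed-format source: '*' in column 7 marks a comment line;
      * division headers start in Area A (column 8).
       IDENTIFICATION DIVISION.
       PROGRAM-ID. HELLO.
       PROCEDURE DIVISION.
           DISPLAY "Hello from COBOL".
           STOP RUN.
```

With OpenCobol this would, assuming the build had succeeded, compile to a standalone executable with `cobc -x hello.cob`; TinyCobol’s driver (`htcobol`, if I’m reading its documentation correctly) is invoked along similar lines. Needless to say, I never got far enough to try either.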

Not being a C programmer — meaning that one of the benefits of OpenCobol, the ability to debug your COBOL program in C, wasn’t particularly useful to me — I decided I’d give TinyCobol a shot first.

Although there are references around the ‘net to binary packages of TinyCobol, there wasn’t any evidence of one for Debian on TC’s website. Hoping — perhaps naively — that something called ‘Tiny’ wouldn’t be too much of a bear to compile myself, I grabbed the sources and dove in.

Although I didn’t have any problems in configuration, as soon as I went to run ‘make’, the errors began. The crucial one seemed to be:

 gcc -I/usr/include -I/usr/local/include -I../lib -I../ -c scan.c
 scan.c:1062: error: syntax error before 'YY_PROTO' 
 scan.l:122: error: syntax error before 'switch'
 ...

After that, things just fell apart. I played around with it for a few hours, trying and retrying, checking all the dependencies, but to no avail. I even went so far as to try it on a brand-new Dapper installation running in a VM, just to make sure something about my system wasn’t poisoning it. Nope.

So after giving up — at least for the moment — on TinyCobol, I decided to give OpenCobol a try instead. Although OpenCobol apparently has a package for Ubuntu Edgy, there’s currently no backport to Dapper, so I was left again with the unappealing alternative of building it myself.

I got the OpenCobol sources and its dependencies installed easily enough, and ran the ./configure script without problems. Looking good so far. But as soon as I typed ‘make’ and started to actually build it, I was filled with a little déjà vu:

fileio.c:308: error: syntax error before 'DB'

Followed by several pages of ‘incomplete type’ errors. So much for that. A quick Google for the error didn’t reveal anything, and since I’m not a C programmer, that’s pretty much the end of the line. (There’s a reason why I normally have a blanket rule against any non-trivial software that requires compilation. The number of times I’ve tried compiling some large software package and actually had it work without deal-breaking problems is very, very small.)

I’m tempted to take this as some sort of cosmic sign; the revenge of all those scoffing C and LISP greybeards on their COBOL cousins. Linux — at least my Linux machine — just doesn’t seem to want anything to do with it.

Anyway, should anyone else out there find a way of running TinyCobol, OpenCobol, or some other COBOL compiler on Ubuntu Dapper (before it goes out of support and I’m forced to upgrade anyway), I’m all ears.

At the moment I’m torn between giving up completely on Linux for this purpose and looking for a working COBOL implementation for Win32, and feeling like since I’ve already put a day’s worth of work into this, I ought to keep banging on it and see if I can get either TC or OpenCobol working on Ubuntu Edgy or one of the other new versions. I think I’ll probably start downloading a new Ubuntu LiveCD while I look for Windows tools, and see which one I get working first.

0 Comments, 0 Trackbacks

[/technology] permalink