
Monday, December 27, 2010

COD: Black Ops - One of the best war movies I've seen recently...

Now past the billion-dollar sales point, Call of Duty: Black Ops, the seventh game in the series and the third by developer Treyarch, has been a huge success, at least sales-wise. But what to make of the game itself?

I preordered the game, based on my enjoyment of several of the past games in the series. Not something I do often - too many recent games have turned out to be turds in a punchbowl, so I now wait until I've heard from my 'buy everything' gaming pals, or get some hands-on time with a new game before shelling out my shekels.

I figured the online MP experience would be plagued with issues on release (it was), and did not even bother to install the game or play it on release day. I ended up so busy playing other games, I'd not gotten around to it until recently.

One of my constant gaming buddies got the game as a gift, and I watched a bit of his game play in single player. It sparked my interest, so here's my nutshell review.

First, a note on my biases: I am mostly interested in single player these days. I retired from competitive gaming, and pubbing is filled with far too many cheats / hacks / noodle-heads to have much fun. I limit my online play to games where my close gaming cohorts are the only other players. They're all good enough that cheating is out of the question for them.

I fired up the single-player campaign, to be met with the typical, long (looooooonnnggg) 'intro' video that the player is forced to watch before getting into the action. Ugh.

And therein lies the crux of my complaint with this game. Far too much of the game is cut-scenes, most of which the player cannot skip. F.E.A.R. got this right: you had to watch such scenes on first play, but after that, you could elect to skip them. Added to this, some of the scenes are ridiculously long. Seriously. One of them is so long, I'm quite certain I could cook myself a nice breakfast, and eat it, before the scene is over.

A few of the scenes can be skipped (too few, in my opinion) but sometimes it's with a mouse click, sometimes with a space bar hit. What? Did anyone Q/A this thing?

To add insult to injury, at the conclusion of one cut-scene (one of those, like in a lame movie, where they run out of time / budget, so they end with the "it was really a dream in the mind of the dog owned by the sister of the neighbor that lived behind the house where the ghost of the axe murderer was seen by the uncle of the bank president that loaned the money to the owner of the pet shop the dog came from in return for sex with what turned out to be his long-lost sister"...you get the idea), you as the player have to endure a long sequence of slow-motion navigating of a government building, interspersed with more cut-scene flashbacks, where your only action is to move your avatar through what feels like air made of clear gelatin. Slow, slow, slow, and boring as hell.

If that kind of 'play' is not irritating enough, there are plenty of cut-scenes that are long enough to put you to sleep, only to find that the instant the cut-scene ends, you were supposed to press some key, or click some mouse button, to rescue your avatar from certain death.

Even when you are actually playing, there are far too many places where you must follow an NPC, and you are strictly limited to a narrow corridor of play, with no way to 'pass' the NPC, as if they had a ten foot wide Plexiglas sheet strapped to their back. More of an 'interactive' cut-scene than real game play.

That aside, the campaigns are challenging enough, particularly on the highest difficulty levels. But there's nothing really fresh here.

If you've played recent releases of the series, you've played this game already: shoot a bunch of baddies, blow up this and that, get this or that important thingy, cut some wires, slap some fools around in the name of interrogation, etc. Same game, different scenery.

The graphics are fine, actually quite good: humans look and act pretty human, and the AI of the NPCs (friendly and enemy) seldom does anything really stupid, though you certainly won't be fooled into thinking there's anything human about them. Avatar movement is also pretty well done.

A nice assortment of weapons is available, so you mostly get to choose how to accomplish your slaughter to your liking.

The full single player game is rather short (I'd guess 4-6 hours for a player familiar with the series), and leads you to the best part of the game, in my opinion: a very well done, very funny scene with banter between several luminaries from the times of the game, resulting in battle with Nazi zombies. The Zombie mode of the game provides a challenging and fun game, sure to please fans of games like the past COD Zombie modes or Left 4 Dead.

Overall, the game feels, to me, like one where an epic, detailed game was planned, that might have provided 20+ hours of single-player involvement, but someone said "No way!", and too many key elements were turned into arduous cut-scenes.

The game, apart from that, feels lovingly made. But, I wouldn't eat a lovingly made shit pie, nor would I recommend it to my friends. Fans of the series will find it a must play, just as fans of Star Wars will buy every DVD in the series despite some of the films being turkeys.

Really, if the developer / publisher continue on this route, why not just make the next game in the series a theatrical release, where moviegoers get a fake game controller, and can pretend they're 'playing' the new COD?

B- on a good day, C+ on days when I'm grumpy.


Monday, December 20, 2010

The Father of Modern Gaming: An excellent profile of Shigeru Miyamoto in The New Yorker.

If you're a gamer, you owe it to yourself to acquire a copy of the December 20 & 27, 2010 issue of The New Yorker.

An excellent, lengthy article on Shigeru Miyamoto, arguably the father of modern gaming and gaming consoles, begins on page 86. A delightful read, even if you are not a console gamer.

If you are unable to peruse a physical copy, the complete article is available at the time of this blog entry's writing at The New Yorker Online.


Sushi For Your Gaming: Hamachi VPN for LAN multiplayer over the internet.

I've recently been playing some older games (Rainbow Six series) in co-op multi-player mode with some gaming buddies. An issue often faced when playing online, particularly with older games, is the availability of the various servers (log-in / authentication / etc.) from the publisher to enable this mode of play.

Sometimes the servers are unavailable for maintenance or other reasons; sometimes they've been shut down completely when the publisher declares the game "end of life". What's a gamer who still enjoys playing with geographically dispersed friends to do with games that fall into these holes?

So long as the game incorporates a LAN multi-player aspect, this can be easily accomplished with the most excellent Hamachi VPN from LogMeIn.

In a nutshell, Hamachi provides a painless, zero-configuration (95% success) Virtual Private Network facility ideally suited to online gaming (unlike a typical VPN, Hamachi will pass broadcast and multicast traffic, vital for the proper functioning of most LAN games.)
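To make the broadcast point concrete: LAN server discovery typically works by a client shouting a small UDP datagram at the subnet broadcast address and listening for replies, which is exactly the traffic an ordinary VPN drops. Here's a minimal Python sketch of that pattern (the port number and message are invented for illustration; every game uses its own):

```python
import socket

DISCOVERY_PORT = 27900  # hypothetical; each game uses its own port

def make_discovery_socket():
    """Create a UDP socket permitted to send LAN broadcast packets,
    the way many LAN games announce and discover servers."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    return s

def announce(sock, message=b"WHO_IS_HOSTING"):
    """Shout at the whole subnet; a VPN that drops broadcast
    traffic silently breaks this step, and clients never see hosts."""
    sock.sendto(message, ("", DISCOVERY_PORT))
```

If the tunnel forwards that datagram (as Hamachi does), the in-game server browser just works; if it doesn't, you get the familiar empty server list.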

Non-commercial use is free, setup is quick and painless. The service provides a web-based interface to create networks, manage clients, and deploy customized client installers. The latter facility enables players with no technical expertise to easily become a member of your private gaming network without any Hamachi configuration - just a simple installation is required.

Mesh (all members can 'see' all other members), Hub-and-Spoke (Members are connected to hubs, hubs 'see' all members, members see only the hubs), and Gateway (members 'see' the entire physical network, and share its address space) networks are supported, Mesh being the most appropriate for most WAN based LAN gaming.

A really fantastic free service with very respectable performance and reliability, plus security (encryption of data flow) when desired.

Its use is of course not limited to gaming: it makes a fine VPN for accessing your LAN while out on the road for general use.

A few caveats for gamers (these apply to the 'host' or originator of the network, and the clients / members):
  • Most games will use the 'first' LAN network adapter found on the machine for traffic. You must ensure that the Hamachi 'adapter' (actually a virtual adapter installed by Hamachi that is treated by the operating system as a real, physical adapter) is 'first' in the list. Failure to do so may result in an inability to connect the participating PCs together in the game (clients may not 'see' the server / host, etc.) You can adjust this by going to the advanced settings in your network connections control panel window (see image below), and using the arrows next to the connections window on the adapters and bindings tab. Move Hamachi to the top. You might need to reboot your machine for the changes to take effect.
  • Recent versions of Windows (7, and probably Vista though I've not verified this) have changed the behavior of broadcast traffic routing. This one had me head-scratching for a bit! Before, such traffic would be sent on the adapter used for the connection. In recent Windows, this order is ignored, and the adapter with the lowest interface metric will be used, regardless of its order in the adapters connections list. The result is symptoms similar to the adapter itself not being seen as 'first': clients will likely not see the server / host, or not be able to connect successfully. To remedy this, right-click on the Hamachi adapter in your network connections control panel window, select properties, double-click on the Internet Protocol Version 4 item, and click on the advanced button in the lower right of the dialog. Set the Interface Metric to 1 and reboot your machine (you can also try just disabling and re-enabling the Hamachi adapter in lieu of rebooting.) The default metric used by Hamachi of 9000 causes broadcast traffic to route over your 'real' NIC, causing connection issues for the games. Network internals savvy users may recognize there are other means to accomplish this end. Feel free to experiment - you won't permanently break anything.
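If you'd rather sanity-check the ordering from a script than click through dialogs, the metric comparison itself is trivial. Here's a sketch that parses netsh-style output; the column layout and the adapter name "Hamachi" are assumptions about a typical setup, not a guaranteed format:

```python
# Given text like the output of:
#   netsh interface ipv4 show interfaces
# check that the Hamachi adapter has the strictly lowest interface
# metric, so broadcast traffic routes over the VPN adapter.

SAMPLE = """\
Idx     Met         MTU          State                Name
---  ----------  ----------  ------------  ---------------------------
  1          50  4294967295  connected     Loopback Pseudo-Interface 1
 11          10        1500  connected     Local Area Connection
 23           1        1500  connected     Hamachi
"""

def parse_metrics(text):
    """Return {interface name: metric} from netsh-style output."""
    metrics = {}
    for line in text.splitlines()[2:]:   # skip the two header lines
        parts = line.split(None, 4)      # Idx, Met, MTU, State, Name
        if len(parts) == 5:
            metrics[parts[4]] = int(parts[1])
    return metrics

def hamachi_wins(metrics):
    """True if the Hamachi adapter's metric is strictly the lowest."""
    ham = metrics.get("Hamachi")
    others = [m for name, m in metrics.items() if name != "Hamachi"]
    return ham is not None and all(ham < m for m in others)
```

With Hamachi's default metric of 9000, `hamachi_wins` comes back False, which is exactly the broken state described above.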
Give it a look and a try. As long as a game supports LAN multi-player, you will always be able to enjoy it with friends in the cloud using this nifty tool.


One of the funniest persons I've read.


I laugh hard every time I visit.


Sunday, December 19, 2010

Aimbots In Real Life: DARPA's Advanced Sighting System.

Thanks to a SpecOps buddy for ringing me up with information that pointed me to the sources for this post.

One of the biggest banes of snipers is wind. It is difficult to estimate speed and direction of winds between the shooter and a potential target, and not unusual for there to be currents and eddies going many directions and speeds along the route of the bullet. This can cause gross inaccuracies in the shots: even under just 10 MPH crosswinds, accuracy of snipers is seen to drop dramatically at ranges over 300 meters.
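To get a feel for the magnitude, here's a crude back-of-the-envelope using the classic "lag time" approximation: lateral drift grows with the difference between the bullet's actual time of flight and its vacuum time of flight. Every number below is an illustrative assumption, not data for any particular rifle:

```python
# Rough illustration of why wind matters at range.
# drift ≈ crosswind speed × (actual TOF − vacuum TOF)

def wind_drift(cross_mps, range_m, muzzle_mps, avg_mps):
    """Lateral drift in meters for a steady full-value crosswind."""
    t_vacuum = range_m / muzzle_mps   # time of flight with no drag
    t_actual = range_m / avg_mps      # drag slows the average velocity
    return cross_mps * (t_actual - t_vacuum)

# A ~10 MPH (4.5 m/s) crosswind, 800 m/s muzzle velocity, and an
# assumed 600 m/s average velocity over the flight:
drift_300 = wind_drift(4.5, 300, 800, 600)   # roughly half a meter
drift_800 = wind_drift(4.5, 800, 800, 600)   # roughly 1.5 m: a clean miss
```

Even at 300 meters the drift is bigger than a torso; misjudge the wind by a few m/s and the shot goes wide.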

DARPA (Defense Advanced Research Projects Agency) put out an RFP some years back for a next-generation sight system to enhance the capabilities of snipers. Known as the Advanced Sighting System (later simplified to the "One Shot" system), this technology will soon be in the hands of test soldiers, with a contract awarded to Lockheed Martin for the initial units. Lockheed Martin had been the winner of the original early prototype contract in 2008, where units showed a twofold to fourfold increase in first shot hits, and a halving of target acquisition time.

The units utilize a ~50μJ, 10ns pulsed laser to 'slice' samples of the air between the shooter and the target (getting the exact range to the target for 'free' in the process.) The aerosol back-scatter of the slices is analyzed by a sensor (either an array, or a single sensor with the laser placed in multiple sample positions per slice) to produce a time cross-correlation. This correlation of the scintillation provides information about the air movement at that slice. The system then calculates the resulting ballistics corrections, and the data is used to 'zero' the reticle for the shooter.
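Here's a toy version of that time cross-correlation idea: two sample positions a known distance apart see the same scintillation pattern, offset in time as the air drifts past, and the lag that maximizes their correlation gives the transit time, hence the wind speed. The signal, sample rate, and geometry below are invented purely for illustration:

```python
import math

def best_lag(a, b, max_lag):
    """Integer lag (in samples) at which b best matches a delayed a."""
    def corr(lag):
        return sum(a[i] * b[i + lag] for i in range(len(a) - max_lag))
    return max(range(max_lag + 1), key=corr)

# Synthetic scintillation: sensor B sees sensor A's pattern 7 samples late.
n, true_lag = 200, 7
a = [math.sin(0.1 * i * i) for i in range(n)]    # pseudo-random-ish signal
b = [0.0] * true_lag + a[:n - true_lag]

lag = best_lag(a, b, max_lag=20)     # recovers the 7-sample offset
sample_rate = 100.0                  # samples/second (assumed)
spacing_m = 0.35                     # distance between positions (assumed)
wind_mps = spacing_m / (lag / sample_rate)   # crosswind estimate
```

The real system does this per slice along the bullet's path, then feeds the profile into the ballistics solution; the correlation trick itself is this simple.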

The benefits are myriad, chiefly:
  • No need for a spotter to guess the range.
  • No guessing of wind velocity, direction, variation and thermal effects down range.
  • No math calculations for spotter or shooter, nor manual adjustments.
  • Vastly reduced sniper training requirements.
  • <1 second from target acquisition to trigger pull.
It's an aim-bot for real life.

Boom. Head shot. At 2000 Meters.


Details of the award can be found at the article in Military & Aerospace Electronics.

Thursday, December 16, 2010

On the subject of rants...

Here's a good one, by Jeff Roberts, a developer for the hugely popular RAD Game Tools. It discusses aliasing in C (more precisely, problems that strict aliasing rules in compilers cause that screw up his code results).

Strict aliasing can result in large performance benefits via compiler optimizations, but it can 'break' otherwise valid code (even the Linux kernel doesn't follow the rules: see lkml.org for a post by Linus Torvalds regarding the insanity.)

The original post from Jeff titled Strict Aliasing... makes for amusing and illuminating reading, as does the rest of his blog. I think he and I would get along swimmingly.


Oops!BSD? We might need to re-evaluate our beliefs in the security of OpenBSD.

OpenBSD is widely hailed as the most secure of the generally available non-hardened operating systems (that is, excluding operating systems designed to specific security constraints, such as GEMSOS and LOCK).

A recently released email might change this view, if found to not be a hoax.

See http://marc.info/?l=openbsd-tech&m=129236621626462&w=2 and prepare to don your conspiracy hats...

As an aside, I can highly recommend the book "Operating System Security (Synthesis Lectures on Information Security, Privacy, and Trust)" by Trent Jaeger for an excellent review of trusted operating systems to the interested reader.


Bailing on BASH, forum idiocy, Et cetera.

It has been a long time since I had a rant post. Or any post, for that matter: Busy with other projects, my daughter (accepted into Stanford early admission, congratulations JC!), and other miscellanea. So, if you're easily offended, best skip this one...seriously. For that matter, if a rant itself will bother you, just skip reading. This is not the typical post here (I think I've really ranted only once before.) You have been warned.

Ramming Speed! (nod to Ben Hur...)

Anyone that has followed me in forums (or knows me personally) knows that I'm a thermonuclear bomb directed at stupidity / unctuousness / marketing snake oil. I really do think the world would be a better place if more took aim at such nonsense: embarrass most fools enough, they'll go away, leaving a better quality pool in the end.

I directed readers to the excellent forum and podcasts on BashandSlash many posts ago, and have been a lightly active poster there since. A recent podcast, however, raised my hackles with its fulsome (not in a good sense) interview of a 'community representative' for DICE. Simply put, no hard questions were asked, and the interviewee exhibited the (unfortunately all too common these days) sliminess of a used car salesman. I said it in a forum post there, and I'll repeat it here: anyone that can fling BS in the name of their 'job' is a perfidious douche nozzle.

And letting them get away with it is a disservice to the PC gaming community. It lowers the bar yet further. I may be Patton to the host's Gandhi, but really, if all one's journalistic intent is to be a conduit for the game publisher's advertisements, why bother?

Now, in my post (I'd link it, but it appears the thread is gone (from an edit updating the content, not from any kind of 'get rid of his post' action)), I replied to someone that contrasted this 'community representative' to a podcast of a different sort (Crosshairs), where recent shows have had the most interesting John Gibson from Tripwire Interactive as a guest.

John's honesty and openness is a refreshing change from the marketing BS, and outright lies, that the PC community has seen from several of these 'community representatives'. (As an aside, just who do they represent? The cowering, sugar-coating PC public that will take the recent crap labeled as games and eat it with a smile? They certainly don't seem to represent the best interests of me, or any of my gaming friends...)

My reply contained the quote above, along with the statement (contrasting the typical 'community representative' with John Gibson) "Most serious gamers would more likely punch lame-toe and IQ_20.20 in the face if they met them..."

Somehow, this was twisted into a 'threat of personal violence' (by my hands) toward one 'Robert Bowling' by the host. Huh? I never used that name, nor have I ever had any interactions with or about that person. Beats me. Maybe Canadian English works differently than mine. In any case, I was issued a warning in the forum by the host. A perfervid response, I think.

Last straw for me: I will not walk on eggshells, much less imaginary ones, for people I don't even know (one Mr. Bowling), so I asked to be immediately removed as a member. No milquetoast clubs for me.

I want to be clear - I have no intrinsic qualms with the host (Jockyitch) - he seems like a straight shooter, and it is his board / forum / podcast, and his rules. I simply think playing Mother Teresa as a journalist does us in the PC community little good when it comes to pressuring developers and publishers to reverse the rapid downward spiral of the last few years in PC game quality and their rush to powder the hineys of the console crew. It is the action I have an issue with: it silences dissent, which further enables the marketing BS behaviors of developers and publishers, with no benefit to PC gamers.

Squelching views not to one's liking, or not to one's tea-and-crumpet politeness standard, has the same chilling effect. Worse has been said directly to the same interviewee in their very own forums, where they have moderator status. To their credit, such posts remain, excluding of course seriously out-of-line "I'll kill you if I find you" kinds of posts. Not that I need that to justify my sentiments to myself: marketing BS pawns are the plague of the earth. Well, at least one of the plagues.

Spirited, sometimes harsh, occasionally rude interaction is just a part of free speech. I myself have been guilty of less-than-parliamentary politeness in interchanges with idiots (see below). Then again, maybe that's a poor standard to cite - have you ever watched the verbal volleys fired in the House of Commons?

Saying someone might punch someone is not a personal threat of violence, it is a metaphor for the level of displeasure many PC gamers feel.

At the same time, I advised the host of the companion show Crosshairs of this action, and that I should probably not participate in future podcasts: there is no way I can interact like some sodium-thiopental-laced patient, bleary-eyed with a drool-festooned grin, mouthing "Oh, yeah, isn't it all just grand!" because it 'fits' the peace-loving agenda of the host. Or in this case, actually, the host's parent host: Crosshairs' host Iblleedv20 can be a contrarian sometimes; he is just much more civilized about it than I am.

I will miss being a member of Crosshairs: some very cool, very smart people are regular participants. I highly advise that you take a listen, if you haven't already, and also to the parent Bash casts.

I wish both hosts the best in their endeavors.

Rant, part Deux: Forum Stupidity.

For Christ's sake, is it that hard, if you disagree with someone in a forum post, to at least know what you're talking about before posting a retort? Anything else just spreads misinformation to the naive reader, and makes the reply look like that of a fool to those that do know what they're talking about.

Case in point: one of the last threads I was involved in on those forums had to do with the miserable sound quality of a game soundtrack demo. You can see the thread here. Now, I'm pretty sure a monkey could discern that the 'strings' are fake. And poor ones at that. Like a 'real' Ming vase found in a box of Cracker Jacks kind of real.

I stated such. This inane response followed (to which I did not reply - my account was deleted per my wishes):

Clearly you are not a musician.

1. A high school orchestra wouldn't be professional enough to tackle a full soundtrack. They'd come in late, they'd come in flat or sharp, the string section would have tuning issues.

2. What I've heard here sounds nothing like a $29.99 Walmart keyboard. I have no idea what you're talking about. You might need better speakers or headphones, though.

3. No one uses a "synthesizer" for orchestral music production anymore. Professional composers use samples, recorded off of REAL instruments. Synthesized implies that the sound is created synthetically, mathematically. I am fairly certain what is being used here are samples, at least from what I can hear. Actually it sounds pretty convincing to me, I'm not sure what you're listening to, Rob.

Let's take this piece by piece, and see why I think it's a pretty stupid response (and unfortunately typical of self-proclaimed experts in all too many forums.)

Clearly you are not a musician.:

And the relevance of this is...what? This is a typical tactic of the weak-minded / unarmed with facts poster. A type of straw man argument (fictitious persona) to attempt to cloud the reader's sense of the real argument. Of course, this person knows nothing about my musical background. Even if they did, it has no relevance.

Audiophiles have long noted musicians having really terrible audio systems that 'sound fine' to the musician. Some have theorized it is due to the constant exposure to real instruments allowing those listeners' brains to compensate for the poor sound, though I know of no academic study ever done to test this. I can speak from personal experience - I've often seen this to be the case with musician friends of mine, one particularly excruciating system belonging to a world-renowned conductor. If anything, not being a musician might be an advantage, based on these observations.

A high school orchestra wouldn't be professional enough to tackle a full soundtrack...:

And your evidence is? You were in a crappy one, perhaps? Anyone that has heard a truly outstanding high school orchestra (and they do exist) will spot this as ridiculous.

What I've heard here sounds nothing like a $29.99 Walmart keyboard. I have no idea what you're talking about. You might need better speakers or headphones, though.:

This kind of reply content toward me, or others, particularly tickles my funny bone. Again, a fictitious persona is created, and the implication is made that the replying poster's argument must be correct because he/she has 'better equipment', or is otherwise better 'equipped'. You never know when you might be addressing someone that has headphones worth more (and of higher quality) than your entire system. Or speakers worth more than your house, with concomitant quality. And in this case, ears to match.

It should be noted, this poster, nickname "golden_ear" (a self-labeling of many that think they have magical abilities in listening skills), joined and posted their one and only post supporting a prior reply to me in the thread by poster "X", nearly immediately followed by one of those "oh, yeah, you are soooo right!" posts by the same poster "X". Please, if you're going to resort to using Internet Sock Puppetry, try and do a better job of making it look legit...

No one uses a "synthesizer" for orchestral music production anymore. Professional composers use samples, recorded off of REAL instruments. Synthesized implies that the sound is created synthetically, mathematically. I am fairly certain what is being used here are samples, at least from what I can hear. Actually it sounds pretty convincing to me.

Now, here's where the poster shows how clueless they are. Anyone knowledgeable about music synthesis of course knows that sampled / multi-sampled synthesizers are still synthesizers. A quick ring to Ray Kurzweil, the father of modern synthesizers that can fool many, will confirm the correct terminology - don't take my word for it. As for it sounding pretty convincing? Well, dear reader, take a listen to the audio in the referenced thread. I would personally be stunned if more than 20% of listeners could not immediately discern that the 'strings' are not real. As for the alleged, self-proclaimed musician(s) in the thread (I qualify the plurality in light of the likely sock puppet involved) that are convinced by it - perhaps a different line of work might lead to better success.

Ahhh, I feel better now.


Wednesday, December 15, 2010

I Hear You!: A fantastic DAW (Digital Audio Workstation) application previously unknown to me.

It's been a while since I've done any serious mixing work, Cubase and Acid being my workhorses. A recent conversation with an audio engineering buddy pointed me to Reaper. In a nutshell, after a few days of experimentation: what a fantastic piece of software!

This is a serious piece of software, a worthy replacement in most cases for Cubase / Acid / et alia (at least for my needs), and at a ridiculously low price for professional use. The non-professional license price is so low, one might feel like 'donating' a few extra shekels to the developer in appreciation for a great product, with a no-nonsense licensing scheme.

Well done, Justin Frankel and Cockos!

I See You!: How clever optics are letting us image planets around other stars.

A quick note on an interesting and very accessible paper titled "An apodizing phase plate coronagraph for VLT/NACO" that discusses the emerging technology used to allow direct imaging of remote planets that would normally be difficult at best due to the (relatively) overpowering brightness of the planet's parent star.

The meat of the result can be seen in the point spread function graph, lower right, on page two: note the distinct reduction of the Airy disk and associated diffraction pattern on the right section of the graph.

Check it out at: An apodizing phase plate coronagraph for VLT/NACO on arXiv.org.
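For the curious, the Airy pattern being suppressed is just the diffraction pattern of a circular aperture, with relative intensity (2·J1(x)/x)². A quick pure-Python sketch, with J1 computed from its integral representation (purely illustrative, not how the paper does its modeling):

```python
import math

def j1(x):
    """Bessel J1 via J1(x) = (1/pi) * integral_0^pi cos(t - x sin t) dt,
    evaluated with a simple trapezoid rule."""
    n = 2000
    h = math.pi / n
    f = lambda t: math.cos(t - x * math.sin(t))
    total = 0.5 * (f(0.0) + f(math.pi))
    for k in range(1, n):
        total += f(k * h)
    return total * h / math.pi

def airy(x):
    """Relative intensity of the Airy diffraction pattern at radius x."""
    if x == 0.0:
        return 1.0
    return (2.0 * j1(x) / x) ** 2
```

The first dark ring sits at x ≈ 3.83; an apodizing phase plate sacrifices some light to push down the diffraction rings on one side of the star, which is the suppression visible in the paper's PSF plot.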

Wednesday, October 27, 2010

Schannel Event 36888 (10 / 10) When Playing BFBC2 / MOH / Etc. - WTF?

Since the beta of EA/DICE's Battlefield: Bad Company 2, forums have seen myriad posts about game problems that include this symptom. Since the retail release of that game, and the subsequent release of Medal of Honor by the same publisher/developer teams, the same has been seen for the latter game.

Specifically, the event log entry in the Windows system log is:

Event 36888, Schannel
The following fatal alert was generated: 10. The internal error state is 10.

When I first saw the error myself, I recognized it from my network programming days as an informational error, indicating some kind of barf-o-rama on the server side of a secure connection handshake. Unlike most of the other Schannel event IDs, this particular one seems to remain undocumented. Nonetheless, the Info opcode and the 10 / 10 Alert Description and Error State hint strongly at it being server-side.

Since it seemed to have no material effect on the playability of the game(s), my interest in investigating it stopped there. A recent poster, however, indicated that disabling their AV (Trend) remedied the apparently related game issues. While it appears that the game itself runs correctly despite encountering the Schannel error, it may be that some A/V products that muck with everything on the wire take drastic action in the face of it. Strange if some do, but plausible.

In any case, barring some other application / utility causing problems (e.g., said A/V), the error itself can be safely ignored. If it really bothers you, you can change the logging level via a registry change by modifying (or adding if needed) the key:


DWORD value EventLogging with a value of 0 will eliminate such event log messages. Note that current versions of Windows seem to be more voluble for these errors - on older versions (e.g., XP), the error may occur without a log entry being generated.

I became interested again in this event / error recently while tracing the traffic activity of the game debugging a separate issue. Both games are built on the same engine / network infrastructure, so it is not surprising they share the same frailties.

From an outsider's view (since I have no access to the game source code, nor the EA master or game servers, my view must be the result of probing and testing theories, using debuggers and traffic sniffers), the network infrastructure for these games is a bit of a wreck. In the same way one might surmise their neighbor is a slob from observations of the trash bags thrown on their front lawn, the mishmash of connections and traffic these games generate is appalling. The possibilities of problems due to one piece failing or being unavailable are surely a source of grief for many attempting to play these games online.

If this system was designed this way from scratch, someone should be publicly whipped with a length of Ethernet cable. If it is the result of 'evolution' of features and functionality by adding servers to the 'master' pool, the time has perhaps come for EA to rethink the infrastructure and rebuild it from scratch.

In any case, the Schannel error in these games appears to be generated by an improperly configured EA server that provides them with hardware information à la Steam's hardware survey.

Another way to eliminate the error (and stop the spying by EA, if that's your stance) is to add the following to the file \windows\system32\drivers\etc\hosts:        bf1943hwtelemetry.ea.com

This prevents the game(s) from even starting the handshake process, short-circuiting the error path.

In summary: the error is harmless; it is not the cause of crashes, etc., in the game itself, though it appears it might throw programs such as A/V into a tizzy (when I feel like it, I may investigate this further.) You can just ignore it, or, if having it in your event log bothers you, take one or both of the steps outlined above.

Friday, June 25, 2010

Where's Rob?

I've been getting a surprising amount of e-mail asking what's up with the lull. Fear not, I'll be actively posting again, but as I said in an earlier entry, I've been busy with other things. Perhaps I should add 'irregular' to the banner - might also increase my audience of readers with bowel problems when they Google for cures...

In case you wonder, I've been busy with:

1) Summer. My High School junior daughter is at Stanford doing research for the whole summer, cutting down our face time a lot. So any real spare time I have that matches her schedule, I'm there with her.

2) New games:
  • Just Cause 2 (micro-micro-review posted).
  • Sniper Ghost Warrior : So far, so not good. I had very high hopes. I'll do a micro review when I form a final opinion.
  • Portal: Prelude: A very tough mod of Portal. Marred by inane dialogue, but some seriously hard and advanced portal use. If you have Portal, check it out.
  • BFBC2, Splinter Cell Conviction, COD: MW2: Replaying these using three displays in ultra-wide-screen mode via Eyefinity. Pretty immersive.
3) The big time sinks:
  • Understanding Information Transmission (IEEE): A high-level overview of many facets of communication and information theory. Used as a textbook by many schools. Not super deep (I own dozens that are seriously deep in my library of ~1500 technical books), but a good addition to my 'loaner' shelf - books I loan to interested friends that want to get a basic background and to stimulate them to pursue deeper studies. Hideously expensive at $65.00 for a slim paperback, but I've not seen a better overview to date.
  • Two volume set of  (The Tenseless)/(The Tensed) Theory of Time: A Critical Examination by W.L. Craig. Traipsing from Special Relativity to A/B-Theories of time, a philosophical work that demands hours of hard thinking for each page, to answer "What is time, is it even real...". Fun.
So, when I'm back, I'll publish the pending articles and some new ones. Thanks to the many that have suggested subjects - do continue to do so!


Wednesday, June 16, 2010

How long until OnLive is OnDeathRow then OnDead? A µ post.

The OnLive gaming service supposedly goes 'live' tomorrow. As I've said in the past, this is destined for an 'epic fail' in my opinion, perhaps to rise from its future ashes as an acquisition by a Google, Netflix, or some such ilk as a mechanism for streaming movies and very simple games. As a medium for FPS games, it requires a leap of faith of Star Trek proportions to believe it will serve serious FPS gamers' needs.

See the interesting article "Why OnLive Can't Possibly Work" for a sane and reasoned critique.

Now, back to some real gaming...

Wednesday, June 2, 2010

Just Cause 2. Get it.

The blogosphere has been quiet here: I've been busy with upcoming posts, hanging out with my daughter, and playing a new game. Just Cause 2 by Avalanche Studios, published by Square Enix / EIDOS.

Take Far Cry (the original, not the über-lame second version) with the wisecracking Jack Carver, mix in a little Antonio Banderas à la "Once Upon a Time in Mexico", throw in some Grand Theft Auto, and fold in some S.T.A.L.K.E.R.: Shadow of Chernobyl missions, factions, and goodie finding.

All played out on a completely open (no loading when moving around the map) 400-square-mile chain of islands navigable by foot, swimming, boats, cars, motorcycles, helicopters, commercial jets, fighter jets, private jets, tuk-tuks, etc.

Those elements alone would make for a pretty good game, but add in a humor component that many times had me laughing so hard I died (in game) because I couldn't see the screen, and you've got a sure-fire winner.

I won't even bother with any kind of micro-review, it wouldn't do justice to a game that is rapidly heading up my chart of personal favorites.

If you liked any of the games mentioned above, get this one. You won't regret it.

Wednesday, May 19, 2010

Nim Chimpsky Plays A Game: A Simple Game Played With Simple Recursion

Nim Chimpsky, named as a pun on Noam Chomsky, the brilliant linguist, cognitive scientist, and philosopher, was a chimpanzee that researcher Herbert S. Terrace of Columbia University claimed had learned a form of sign language and was able to communicate intelligently. Whether or not this was in fact true is still hotly debated in the research community.

If one believes that Nim had the intelligence to learn and use language, one might think the chimp could play one of the simplest of children's games, also called Nim. The game of Nim is a game of extensive form and perfect information that comes in many flavors. We will be using perhaps the simplest form, often called the subtraction game. This version consists of two players starting with a number (usually represented as a pile of objects like coins or pebbles), where each player in turn removes one, two, or three objects. The player to take the last object is the loser.

It can be trivially shown that such a game reduces to a player avoiding being left with n ≡ 1 (mod 4) objects; in other words, a player should try to remove enough objects to leave the opponent with 1 (mod 4) objects. In the usual game, where the moves are limited to {1, 2, 3, …, m}, this becomes 1 (mod m+1), i.e., if a player can take one to six objects in a turn, they want to leave the opponent with 1 (mod 7) objects. If a player finds themselves in such a position already, they cannot move to a winning position, and the usual strategy is to remove only one object in the hope that the more objects that remain, the more likely the opponent will make a mistake.

When she was a toddler, the game amused my daughter. It didn't take long for her to figure out the strategy, however, and then it became boring. I came up with a variation that I called "Nim with holes". In this variation, instead of the moves being the positive integers with some upper limit, the moves can be any member of some set X of positive integers agreed upon by the players. So a game might start off with a pile of objects of size fifty, with "legal" moves being the removal of, say, n ∈ {2,4,7,9} objects. Or perhaps "Prime Nim", where the moves are limited to n ∈ {2,3,5,7,11}. The "holes" version of the game can be analyzed in a similar fashion to the original; the closed-form strategy is left as an exercise for the reader.

Instead, let's look at solving the problem using a simple recursive program. Common in the functional programming style (and usually preferred over imperative/iterative methods for this style of programming), a recursive program (or routine or function) is one that references itself in its body.

For example, a simple function to calculate the factorial of a number n (usually denoted n!, and equal to the product of all positive integers less than or equal to n) could be written in the iterative and functional styles (using C for this example) as:

Factorial function in C language, written in iterative style (left) and functional style (right).

As can be seen, the iterative version of the function "iterates", or steps incrementally, toward the goal: starting with a value of one for the variable fact, it steps through each integer, multiplying the value of fact by that integer, until the input value n has been reached. The functional style, on the other hand, does away with the need for any intermediate variable. Instead, it "winds up" to the return value by taking the input value n and multiplying it by the value of the factorial function of n-1. In a recursive style of programming, the "stop" or "base case" must be defined to allow the "unwinding" of the recursion. In our example, we see that when the factorial function is passed a value less than or equal to one, it simply returns 1. This returned value is used in the multiplication by the caller, and so on back up the recursion chain until the final, initial caller returns the ultimate result.
The same factorial function, written using Scheme, would have the following form:
(define factorial (lambda (n) (if (= n 1) 1 (* n (factorial (- n 1))))))

Recursion can be a difficult concept to get one's head around initially, especially for non-programmers (and even some programmers!). An excellent introduction to the thought process can be found in Thinking Recursively by Eric Roberts. A deeper examination can be had in the landmark book by Douglas Hofstadter titled Gödel, Escher, Bach: An Eternal Golden Braid. Anyone interested in recursion should have these books on their shelves. They are certainly must-haves for any self-respecting programmer.
We can see how solving our game of "Nim with holes" lends itself to a recursive solution by thinking about the possible plays going backwards, that is, starting with a losing position. Let's consider a game of "Nim with holes" where the players have agreed the legal moves must belong to the set {2,3,5}. We immediately see that a player finding themselves with one or two objects is in a losing position, and a player finding themselves with 3,4,5,6, or 7 objects can leave their opponent in that losing position. As we move away from the goal, reversing the possible game play steps, we would find that having eight or nine objects, or fifteen or sixteen are "losing positions", that is, the player faced with these cannot remove a number of objects that allows them to leave their opponent with a losing position, nor leave a position where the opponent cannot leave them the next losing position closer to the end game.

With a simple set of legal moves, this is easy to do in one's head. As the set of legal moves becomes more complex, determining the winning and losing positions becomes more difficult. Algorithmically it is trivial, and using a recursive, functional style of programming, elegant.
We want to build a simple function that can take the game state (the number of objects) and the set of legal moves as its arguments, and return "True" if we've won already, "False" if we're in a losing position (meaning we'll just move the minimum allowed), and the position we need to move to (if we can) that will put our opponent in a losing position (that we can then maintain for the rest of the game: once there, perfect play means the opponent is ultimately doomed.)
A trivial realization of this using Scheme is shown below.
"Nim with holes" move analysis function written in Scheme with example results (click to enlarge).

The winning position (winpos) function takes the game's current state (number of objects left) and a list of the legal moves. If we've already won (no objects left), #t (true) is returned. Otherwise, for each move in the legal move list (using the map function in Scheme), the function tests whether any of the reachable positions is a losing position, by testing each of them for the ability to reach a losing position, by testing each of them... until it winds recursively down to the base case: we have won. If the recursion is able to reach this state, when it is "unwound", the final result is the first position presently reachable that can lead us to the ultimate winning position. If the recursion cannot reach the ultimate winning position, it returns #f (false), indicating we are presently in a losing position and should take the minimum allowed move.

The operation of this recursion can perhaps be most simply imagined by first thinking about the game in what surely must be the simplest case: the only legal move is to take one object. Clearly, the winning positions alternate, with all odd numbers of objects being a losing position, and all even numbers being the winning positions. Pick some position, say five objects. The function tests for zero, which fails, resulting in the function calling itself for a value of four objects, which goes through the same sequence to call itself for three objects...until we arrive at the call to itself for zero objects.

This is our "base case", where the function returns #t (true). As this return is "unwound" or "percolated" back to the original first caller, note that it will have the logical not applied for each return of each nested call. This is where in our function the move positions alternate between player and opponent: what is a winning position for a player is a losing position for their opponent and vice versa. In our example case, we are in effect seeing the result of not(not(not(not(not(#t))))), resulting in #f (false), meaning our position of 5 objects is unfortunately a losing position in the case of the only legal move being removal of one object.

We can see in the lower part of the IDE the results from asking the function to evaluate a game state of 17 objects with legal moves of 2, 3, or 5. The result indicates taking 2 objects and leaving 15 will put us in the winning position. The second example uses a simple map to apply the winpos function to a list of possible game states from 0 to 19, using 2, 3, and 5 as the legal moves. The result is the list of results from the winpos function.

Readers can download the excellent (and free) DrScheme environment from PLT Scheme should they want to experiment with Scheme, functional programming, and recursion. The site has excellent tutorial materials and references for those interested in learning the details of this superb Lisp dialect. The trivial game-solving function example (which can of course be extended to any similarly solvable game) can be found at NimWithHoles.scm.

Monday, May 17, 2010

Read This Or I'll Punch You: Media and Societal Hype vs Violent Video Games

"It was almost frightening, the reaction... from teenage boys"

"a threat to the moral well being"

"fosters almost totally negative and destructive reactions in young people"


"definite danger to the security of the United States"

Reactions to video games considered violent? No, these are quotes about Elvis Presley and "rock and roll" music only a few decades ago. How quaint, and how ridiculous, the hype and overreaction seem to us now.

The same kinds of overreaction and hype have been seen from society at large and "experts" before: Television, violent movies, comic books, "professional" wrestling to name a few.

The current poster child of the extremists these days seems to be "violent" video games, that is, video games that graphically portray violence, gore, criminal behavior, or other material considered by some to be "provocative" or otherwise objectionable. Opinion and studies have attempted to link such games to actual aggression and addiction by the players.

In this blog entry, I'm going to argue that these opinions and "studies" do not hold water.

According to statistics reported by the Entertainment Software Association, 68% of US households play video games, but only about a quarter of video game players are under the age of 18: the majority of players are adults, who in fact represent the fastest-growing demographic of players, at around 35% of U.S. households. In fact, the average age of players is 35, and the most frequent game purchaser is 39 years old. An amazing fact to this author is that by 2009, 25% of video game players were over 50 years old - we might have to add a key for "walker" in addition to the sprint key... hardly children, I think you'll agree.

In addition, 92% of players under 18 (children) report that their parents are present when the purchase or rental of video games is made, and nearly two-thirds of parents believe video games are a positive part of their children's lives.

And yet, studies such as "Effects of violent video games on aggressive behavior, aggressive cognition, aggressive affect, physiological arousal, and prosocial behavior: A meta-analytic review of the scientific literature" by Anderson, "Aggression and psychopathology in adolescents with a preference for violent electronic games"  by Funk et al., and "Violent video games: The newest media violence hazard" by Gentile all claim to show links between the playing of violent video games and actual aggressive and violent behavior by the players.

Is there such a link? As in any kind of investigative science such as this, there will be differing views. I believe a careful examination of reports such as those listed and a comparison with those holding opposing views must lead one to the conclusion that reports linking video game violence to actual violence suffer from poor methodologies, ignoring negative results, failing to cite academic research with differing views, and improper use of meta-analytical techniques. Craig A. Anderson, author of one of the more comprehensive papers supporting the hypothesis, and a witness before the U.S. Senate on the issue, seems to be a particularly egregious offender toward scientific accuracy and journalistic integrity.

Major studies by groups such as The Harvard Medical School Center for Mental Health, The Journal of Adolescent Health, and The British Medical Journal have shown no significant link between video games and actual player violence. As is made clear in these studies, the old adage "correlation does not imply causation" holds true here, as it should in any honest scientific study. It appears to the author that most if not all of the authors of reports supporting the hypothesis have fallen into the logical fallacy of Post hoc ergo propter hoc: "After this, therefore because of this". The mistake lies in reaching such conclusions without considering confounding factors that could rule out any actual connection or causality. Correlations prove very little, if anything, other than that a correlation exists. Is it more likely that a violent game begets violence, or that violent children prefer violent games?

Constructing a scientifically valid test of the hypothesis that violent video game play begets actual violence may be nearly impossible: the very definition of "aggression" is difficult to measure objectively. Some studies done with college students, attempting to measure aggression objectively by allowing players to blast their "opponents" with noise bursts as "punishment", are flawed because the participants view the exercise itself as a game. There is no real "punishment": the noise bursts cannot cause any real harm, and there is little if any remorse in punishing your opponent if they punish you.

In addition, an accurate definition of "violence" in video games themselves is hard to come by. Is the Madden football series really violent, as some studies state, considering the societal acceptance of the real game of football, where nearly 23 deaths per year are reported on average? Is the children's game Kirby violent? After all, the protagonist swallows their enemies whole, absorbing all of their powers.

I agree wholeheartedly with Nathaniel Edwards in his blogcritics posting: "No matter what a study's results show, the media can be counted on to warp it enough to make it interesting. Typically, this means that headlines claim a greater link between violent media and aggression. There are few details in the actual news stories, and instead there are lots of sweeping claims which don't allow the reader to interpret anything."

There can be no doubt that tragedies such as The Columbine High School massacre and The Virginia Tech massacre are calamities beyond the imaginations of most of us. Expressions of condolences to the victims and their families and friends seem hollow, so horrible were the events. Nonetheless, the immediate links to violent video games made by the media and "experts" were scientifically unfounded.

A particularly bright light of reason and scientific integrity is Christopher J. Ferguson of the Behavioral, Applied Sciences and Criminal Justice department at Texas A&M International University. Professor Ferguson has done extensive research on violent behavior, and published landmark papers specifically covering the aspect from a video game play standpoint, along with various lay articles on the subject.

In his paper titled "The School Shooting/Violent Video Game Link: Causal Relationship or Moral Panic?", Ferguson states "Some scholars have attempted to draw links between laboratory and correlational research on video game playing and school shooting incidents. This paper argues that such claims are faulty and fail to acknowledge the significant methodological and constructional divides between existing video game research and acts of serious aggression and violence. It is concluded that no significant relationship between violent video game exposure and school shooting incidents has been demonstrated in the existing scientific literature, and that data from real world violence call such a link into question." 

The paper shows how the conclusions reached by the research he studied, which support the hypothesis that violent video gaming causes real violence, are faulty. A most interesting fact revealed in the paper is that while video game play has grown explosively among children and adults, violent crime over the same period has decreased significantly, both in police arrest data and in crime victimization data in the U.S., with similar results found in studies from Canada, Australia, the European Union, and the United Kingdom. The interested reader is referred to the link for the paper for details of Ferguson's study.

A more approachable article for the lay reader was published by Ferguson in the September/October 2009 issue of Skeptical Inquirer, titled "Violent Video Games: Dogma, Fear, and Pseudoscience". In this article, Ferguson reviews and distills research on both sides of the argument, reaching the same conclusion as found in his academic research and publications: there is no proven link between the playing of violent video games and actual violence by the players of such games, and current research and studies that claim otherwise seem to suffer from severe methodological and other issues that compromise their scientific integrity and usefulness.

In particular, Ferguson shows how these studies typically suffer from severe "citation bias", that is, a failure to honestly report on research that does not support the author's hypothesis. Even more concerning are papers, such as the aforementioned Anderson study, where the authors appear to ignore their own results in order to forward their a priori hypothesis. Specifically, Anderson used four measures of aggression in his laboratory studies, and found only a very weak significance for one of them. Had proper statistical techniques been employed by the author, even this weak link would have been shown to be statistically insignificant. Nonetheless, the author chose to ignore the results that did not support his hypothesis, and published a paper based on the single, intrinsically flawed, result set.

Ferguson sums up this kind of behavior, unfortunately typical of the side supporting the hypothesis, with: "I believe that these authors have ceased functioning as scientists and have begun functioning as activists, becoming their own pressure group. Indeed, in at least one article, the authors appear to actively advocate censorship of academic critics and to encourage psychologists to make more extreme statements in the popular press about violent video-game effects."

With respect to the links made between the tragic school shootings and the fact that the perpetrators played violent video games, Ferguson retorts "It is certainly true that most (although not all) school shooters played violent video games. So do most other boys and young men. Concluding that a school shooter likely played violent video games may seem prescient, but it is not. It is about as predictive as suggesting that they have probably worn sneakers at some time in the past, are able to grow facial hair, have testicles, or anything else that is fairly ubiquitous among males."

That legal nut cases such as Jack Thompson, a particularly troublesome and misinformed activist, have been disbarred for their antics gives a glimmer of hope that sanity will prevail. Just as the "Elvis is the Devil" chants that were once the norm now seem ludicrous, it appears the media and science are coming to their senses with respect to video games and violence. A recent paper in the medical journal The Lancet suggests "the time may have come to move beyond professional research focusing on media violence, generally, as a cause of serious aggression."

U.S. courts have blocked laws that attempt to outlaw violent video games, and most recently the U.S. Supreme Court has agreed to review a California law that attempted to restrict the sale of violent video games.

We may finally get the legal protection we deserve to purchase and play the games we choose. And as more researchers like Professor Ferguson present the facts, backed with proper research and scientific integrity, the spectre that the media has portrayed regarding violent real-life behavior and video games will fade into oblivion.

Doing the Jitterbug: Lag, Latency, Jitter and Throughput and How They Affect Online Gaming (Part I)

Consistency: In an ideal online gaming world, every player in the game would see the same consistent game world, and every client of a game server would have access to exactly the same game state information at any given instant. No player would suffer from seeing actions in the game world later than other players, or lose a shooting duel because the opponent has a faster, lower-latency connection to the game server. In the real world, this is of course impossible to achieve with existing technology.

Perhaps the greatest impediment to reaching this goal is caused by the vagaries of the underlying infrastructure that provides the communication between the game clients and servers: the Internet. The path to the goal of perfection is washed away by two primary effects in game communication over the Internet:
  • Throughput limitations - the connection between server and client has some maximum throughput.
  • Network delays - even with unlimited throughput, server messages do not arrive instantly.
The first, throughput limitations (bandwidth), means that the game state transmitted to game clients is at best a sample, or approximation, of the actual complete game world. The bandwidth and processing power simply do not exist in typical environments to allow all of the information required to completely represent the game state to be communicated. Just as the typical music CD has less information than the master recording from which it is made due to the sampling of the audio signal to match the constraints of the CD audio format, so too do the messages that the game clients and servers interchange. By careful crafting of the structure of the messages used, game developers can provide an approximation that is sufficient for realistic game play, so long as the minimum throughput requirements of the game are met.

The second, network delays, means that regardless of the available bandwidth, there is always a delay between the servers sending a message and the client receiving it and vice versa. In effect, the information in the message is time-shifted to a later wall-clock time: an event may happen at actual time X, but by the time the client receives the message of the event, the actual time is perhaps X+2.  When the possibility of time-shifting is present, the possibility of causality violations rears its head: events may seem to occur in the wrong order for game clients. For example, a player may open a door, but that door isn't there in the game world because it was destroyed by a grenade (and seen as such by another online player), an event that the client component of the first player was not yet aware of due to the delays in message arrival. Other players might see the first player's avatar "opening" a door that isn't even there.

We will review the details of these problems, the effect they have for players of online games, and the techniques commonly used to minimize the unfairness between players that can result if these problems are left untreated. We will cover the definitions and details of the problems in this blog entry, with part II to cover the effects these have on players and the techniques used in games to minimize their impact.

Throughput and Latency: 

The throughput requirements for online PC games vary widely, but in general are far below the available bandwidth of the typical client (we will only be discussing the client side and impact; obviously, running a server for a game dictates a much higher aggregate bandwidth requirement). Recent studies (Feng 2002 & 2005) using games such as Counter-Strike, Day of Defeat, Medal of Honor: Allied Assault, and Unreal Tournament 2003 showed a client load ranging from 6400 bps to 8000 bps for client-to-server packets and 20800 bps to 36000 bps for server-to-client communications. These are far below even the lower-tiered ISP services typically used by online gamers.

Congestion of the network may cause throughput to drop to a level that is insufficient for smooth game play. Congestion typically occurs in one of three primary areas: the last mile near the user, the middle or Internet cloud, and the last mile on the server side.

In the case of user-side congestion, the player may simply have a service tier that does not provide sufficient bandwidth. This can of course be remedied with a service upgrade. At a minimum, a service with 600 kbps down and 50 kbps up should suffice. The faster downlink speed, while not strictly required to play online, will ensure faster downloads of game items such as server-hosted maps.

The gamer should also ensure that other activities on their local network are not causing congestion. Other users gaming, streaming audio or video, using services such as torrents, etc. can all adversely affect the overall available broadband bandwidth for the player.

Problems in the last mile on the server side can be caused by too many players joining a specific game server, causing a bottleneck on the network link to that server. Game servers typically employ a player count limit to avoid this occurrence. Any other congestion in this link of the game communication network (router congestion or failure modes, etc.) is likely to be out of the control of both the player and their server provider.

Congestion in the Internet cloud is usually temporal: perhaps a widely followed sporting event is being streamed to large numbers of viewers. As with most last-mile issues on the server side, these are out of the control of the server provider and game player. In cases where Internet cloud congestion is the cause of game play issues, the only remedy is to wait until the problem "goes away".

Any kind of congestion, whatever the cause, can cause throughput degradation that may adversely affect the consistency of game play. If the game client is starved of message packets due to actual throughput issues or congestion related throughput issues, the synchronization between the client and server will be lost, resulting in "laggy" game play, "rubber-banding", and other temporal effects. Severe throughput problems can result in the game client "giving up" and disconnecting from the game server.

There is no accepted and commonly agreed upon definition for latency (Delaney 2006). The latency of a network is commonly measured using the ping command. This however measures not the one-way trip from client to server or vice versa, but instead measures the round-trip time. Since the routes from client to server and server to client are usually asymmetric, simply guessing at half the value arrived at from a ping measurement may be grossly inaccurate, and provide incorrect information for making client and server timing decisions. In addition, such a measurement does not account for processing and other delays at the client and server endpoints.

A more useful measurement is the endpoint-to-endpoint measurement of latency that accounts for time needed for client-side processing, bi-directional network delay, and server-side processing (Stead 2008).
This is important: It has been found in studies that much of the delay in the overall game processing loop is caused by the game client handling and processing of messages.

The sources of network delay fall into four basic categories (Kurose 2009):
  • Transmission delay: Packet time to physical layer.
  • Queuing delay: Packet time waiting to be sent to a link.
  • Processing delay: Packet time spent at routers along the route.
  • Propagation delay: Packet time in physical link (bounded by the speed of light).
Transmission delay occurs during the movement of the packet to a physical link. For example, if you are using a 1 Mbps WAN connection, each bit takes 1 µs to send, so a 500-byte (4000-bit) packet takes 4 ms.

Queuing delay can occur at routers along the path of the packets. If a router is under heavy utilization or the required outbound link is busy, the packet will be queued in a buffer until it can be sent.

Processing delay is also incurred at routers, since these must handle routing table checks, possible firewall rule application, and packet checksum and error checking.

Lastly, even if the delays due to processing overhead, transmission, and queuing could be eliminated, we are still bound by the laws of physics. No signal can travel faster than light (2.998x10^8 m/s in vacuo). Speeds in actual transmission media will be lower (e.g., 1.949x10^8 m/s in typical optical fiber, and significantly lower for twisted-pair copper). This means we are bounded by an absolute minimum round-trip latency of roughly 2 ms, client endpoint to server endpoint and back, for a client-to-server distance of 200 km.


Jitter is the variation in network latency caused by changes in the state of the network. Packets that comprise the communication between the game client and server seldom follow the exact same route endpoint to endpoint. This can cause packets to have different latencies. In addition, network congestion can result in changes in the routing and router buffering behavior, changing the queuing delays for the affected routers.

We can visualize this effect with the aid of a diagram.

In this diagram, packets are sent from the server represented by the lower solid line at regular intervals (time ticks) to the client represented by the upper solid line. If we were able to construct a network with none of the four causes of latency outlined, and in addition discovered a way to violate the laws of physics and send our packets with infinite speed, the green line results: there is no latency between the server sending a packet and the client receiving it.

The more realistic example is represented by the blue line, which shows the slight delay the packet experiences traversing the network from the server to the client. The orange line depicts the next packet in the sequence, which is delayed by the same amount as the packet of the blue line. In the ideal world, the latency from the server to client and vice versa would exhibit this constancy. This would simplify any "compensation" for latency the game developers might wish to utilize, and even without compensation, humans tend to have an easier time adapting to latency in a game when it is relatively constant, even when the latency is rather large (Claypool 2006).

More typically, the game packets experience changes in latency from routing and congestion problems. This is illustrated with the final train of three packets colored red, magenta, and dark brick red. For these packets, it is clear any semblance of packet arrival at relatively regular time ticks is completely lost. There is currently no standard measure for jitter in game traffic. Jitter in networks tends to exhibit randomness, but can be characterized by a Gaussian distribution for inter-packet arrival times (Perkins 2003). Since we are bounded by conditions such as some minimal amounts of processing, queuing, and transmission delay in addition to the absolute bound due to the propagation delay, the actual distribution is biased: there is some absolute minimum that can be realized, and network congestion and related issues can cause delays to be skewed. This is illustrated in the following graph.
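To get a feel for such a skewed distribution, we can sample from a toy model. Everything here is illustrative: the 20 ms floor and the lognormal parameters are invented for the sketch, not measured from any real network:

```python
import random
import statistics

random.seed(1)

FLOOR_MS = 20.0  # assumed hard floor: propagation plus minimal transmission/processing

def one_way_delay_ms() -> float:
    # A hard minimum plus a right-skewed random component standing in for
    # queuing and congestion; the lognormal parameters are purely illustrative.
    return FLOOR_MS + random.lognormvariate(0.0, 0.5)

samples = [one_way_delay_ms() for _ in range(10_000)]
print(min(samples))   # never below the 20 ms floor
# mean exceeds median: the long right tail skews the distribution
print(statistics.mean(samples) - statistics.median(samples))
```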

Graph of Gaussian (Red) and skewed/biased distributions (Blue) for inter-packet arrival times.

The fit is sufficient that we can use this model for predicting the likelihood of specific inter-packet times for use in the design of compensatory mechanisms for games.

In part II of Doing the Jitterbug, we will investigate what effects these issues have on game play, and what techniques can be used to minimize these effects.

Interested readers can find references for further study after the jump break.

Sunday, May 16, 2010

See Your Way Out - Using mathematics and image processing to quickly pathfind

As I outlined in my blog entry Earning Your Pathfinding Merit Badge: How Game Characters Navigate Their Game World, pathfinding (or path finding) is a critical component in PC games.

Pathfinding refers to the techniques used in games to allow Non-Player Characters (NPC) to navigate the game world.

Most games use some variant of Dijkstra's algorithm, a technique pioneered by Dutch computer scientist Edsger Dijkstra. Dijkstra's algorithm produces a shortest-path tree by solving the shortest path problem for a graph with non-negative edge lengths (the distances between the vertices of the graph are never negative). It is one of a myriad of graph search (or graph traversal) algorithms.

Dijkstra's algorithm is often used in routing problems, such as network routing protocols, most notably IS-IS and OSPF (Open Shortest Path First), and transportation map shortest path solutions.

Perhaps the most commonly used variant in games is the A* search algorithm. Instead of expanding nodes in order of known distance from the start node (vertex), this variant chooses nodes based on an estimate of the total path length from the start node, through the node under consideration, to the destination node. This estimate is formed by adding the known distance from the start node to a guess of the remaining distance to the destination node. This guess is called the heuristic of the algorithm.

By utilizing such a technique, the A* algorithm provides improved performance when compared to Dijkstra's algorithm. If the guess is set to zero, the two algorithms are equivalent. When the guess is positive but less than the true distance to the goal, A* continues to find optimal paths, but because fewer nodes are examined, performance is improved. When the guess exactly matches the actual distance, A* finds the optimal path while examining the minimum possible number of nodes. If the guess is increased further still, A* continues to find paths while examining even fewer nodes, but no longer guarantees an optimal path. Since it is not practical for most games to guarantee a correct heuristic at all times, this trade-off is usually acceptable given the performance gained.
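To make the "known distance plus guess" idea concrete, here is a minimal A* sketch on a 4-connected grid. The grid, unit move costs, and Manhattan-distance heuristic are my illustrative choices, not from any particular game; note that replacing the heuristic with zero turns this into Dijkstra's algorithm, matching the equivalence described above.

```python
import heapq

def astar(grid, start, goal):
    """A* over a 4-connected grid of 0 = open, 1 = wall; unit move cost.
    The Manhattan-distance heuristic never overestimates here, so the
    returned path is optimal. With h = 0 this reduces to Dijkstra."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    best_g = {start: 0}                    # cheapest known cost to each node
    parent = {}
    frontier = [(h(start), 0, start)]      # (f = g + h, g, node)
    while frontier:
        f, g, node = heapq.heappop(frontier)
        if node == goal:                   # reconstruct the path back to start
            path = [node]
            while node in parent:
                node = parent[node]
                path.append(node)
            return path[::-1]
        if g > best_g.get(node, float("inf")):
            continue                       # stale heap entry, skip it
        r, c = node
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best_g.get(nxt, float("inf")):
                    best_g[nxt] = ng
                    parent[nxt] = node
                    heapq.heappush(frontier, (ng + h(nxt), ng, nxt))
    return None                            # no route exists

grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0],
        [0, 1, 1, 0]]
path = astar(grid, (0, 0), (3, 3))
print(path)
```

The stale-entry check stands in for a decrease-key operation, which Python's heapq does not provide; it is a common idiom for priority-queue searches.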

With extremely complex environments (like the maze we'll be looking at), these algorithms can prove to be impractical, with other solutions preferred. A nice overview of such issues can be seen at Sean Barrett's Path Finding blog entry.

I'm going to go through a technique that is conceptually beautiful to me: Using image processing to solve a maze path finding problem.

I'm sure every reader is familiar with mazes. Perhaps some of you have found yourself lost in a particularly fiendish one, having to ring the nearby "rescue bell"!

I'll be using the superb Matlab system from Mathworks.

From their web site: "MATLAB® is a high-level language and interactive environment that enables you to perform computationally intensive tasks faster than with traditional programming languages such as C, C++, and Fortran." 

It most certainly is! Along with the equally superb Mathematica it is the staple of my tool set for doing complex mathematical programming, analysis, graphing, and data manipulation. These two products would be my desert island choices to take with me if I could only have two programming environments.

I'll be using the Matlab image processing package DIPimage, developed by Cris Luengo, Professor in multidimensional image analysis at the Swedish University of Agricultural Sciences and Uppsala University. I've borrowed copiously from his examples. Users of Matlab can obtain a license for DIPimage free of charge (normally a $3200.00 and up cost) for academic, non-commercial use. I highly recommend it!

Let's begin.

We will start with the following simple (no holes, loops, etc.) maze:

Note the entry and exit apertures in the lower right and upper right areas of the outer wall of the maze. Try solving it manually if you'd like!

Next, we use our image processing functions to convert our maze to greyscale, and threshold it to obtain a binary image. The maze walls are now shown in red, with the path areas in black:

maze = colorspace(inputmaze,'grey');   % convert to a grey-value image
maze = ~threshold(maze)                % binarize: walls true, paths false

We use image processing functions to "label" any found walls and show them with differing colors. In this case, we clearly see the maze comprises two walls:

maze = label(maze);   % label the connected wall components

From this we can easily see the path is the area between the two maze walls. By applying a little mathematical morphology, we can isolate one of the walls:

path = maze==1   % keep only the first labeled wall

We then can do a binary dilation on the wall and fill in any holes (effectively a smearing of the wall):

pathwidth = 5;
path = bdilation(path,pathwidth);   % smear the wall outward
path = fillholes(path)              % fill any enclosed holes

Next, we erode this smeared version of the wall. Erosion effectively "shrinks" an object, removing components smaller than the erosion radius; for a binary image such as ours, it removes the perimeter pixels. We then subtract the eroded image from the smeared one:
path = path - berosion(path,pathwidth);   % keep the band the erosion removed

Finally, overlaying the result with our original maze shows our desired goal:


I think you'll agree, a most fascinating use of tools to solve a problem from a domain you'd not likely think to be image processing.
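For readers without Matlab, the same morphological pipeline can be sketched with SciPy's ndimage module, whose label, binary_dilation, binary_fill_holes, and binary_erosion functions roughly parallel the DIPimage calls used above. The "maze" below is only a toy corridor between two straight walls, just enough to exercise each step:

```python
import numpy as np
from scipy import ndimage

# Toy stand-in for the thresholded maze: True = wall. Two parallel walls
# with a corridor between them give the two labeled components.
walls = np.zeros((7, 9), dtype=bool)
walls[1, :] = True
walls[5, :] = True

labeled, count = ndimage.label(walls)     # label the wall components
assert count == 2
wall = labeled == 1                       # isolate one wall (maze == 1)

width = 2
grown = ndimage.binary_dilation(wall, iterations=width)   # bdilation
filled = ndimage.binary_fill_holes(grown)                 # fillholes
# berosion; border_value=1 keeps the image edge from eroding inward
shrunk = ndimage.binary_erosion(filled, iterations=width, border_value=1)
path = filled & ~shrunk                   # path = path - berosion(path, width)

print(path.any(), (path & walls).any())   # True False: the path avoids the walls
```

On a real maze image you would first threshold it to booleans, and pick the dilation width to be about half the corridor width, as in the DIPimage example.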

The interested reader can find details of techniques such as these in these excellent references:
The Image Processing Handbook by John C. Russ
Digital Image Processing Using MATLAB by Rafael C. Gonzalez
Feature Extraction & Image Processing by Mark Nixon
Happy trails to you!

Saturday, May 15, 2010

The Alpha and The Omega of "Search" Engines?: WolframAlpha

You might have noticed I've embedded WolframAlpha into the heading area of the blog. You might not know what it is.

Stephen Wolfram is one of the brightest people you could ever meet. Most certainly a genius. Educated at Eton and later at the University of Oxford, he had by the age of eighteen already written cited papers on particle physics and quark production. He received his PhD in particle physics at the California Institute of Technology, where he became a faculty member. There he pursued research in computer science, cellular automata, physics, and cosmology, earning him one of the first awards of the MacArthur Fellows Program, commonly known as the "Genius Award".

While at Caltech, Wolfram worked on the implementation of the computer algebra system SMP, essentially version one of what later became the widely used standard for technical computing, Mathematica. Irascible, opinionated, brash and forceful, Wolfram is not one to shy away from imposing his world views on others. His recent years of research led to the publication of A New Kind Of Science, outlined in my blog entry Life Goes On: Further Ruminations on Conway's Game of Life.

WolframAlpha is Stephen's attempt to provide a generalized "computational knowledge engine". Based on the prodigious capabilities of Mathematica, WolframAlpha simultaneously provides a hyper-capable calculating environment and the ability to query for data, either for the sake of the returned data or for use in calculations. The data used is "curated" by Wolfram Research, providing for a much more reliable level of quality compared to existing search engines.

This is no Google. Asking some generic question is likely to be met with the WolframAlpha equivalent of "Huh?":

Ask the engine something more precise, such as "United States population growth", and the character of the system begins to emerge:

Combine this with calculation capabilities second only to actually purchasing a copy of Mathematica to run locally, and you have truly phenomenal capacity for data gathering, analysis, and calculation at your fingertips.

Here, WolframAlpha shows us the integral of a complex expression, integrate sin(theta)/cos(theta)^(2 pi):

Give it a try, and visit the site at WolframAlpha for more details.

To give it a whirl right now, try typing "ISS now" (without the quotes) into the WolframAlpha box at the top of the blog entry, and pressing enter or clicking on the equals sign. Be prepared to be impressed!

Bar Bets and Thermonuclear Bombs

Stanislaw Ulam, along with Edward Teller, fathered the thermonuclear bomb or Hydrogen Bomb as it is more commonly known. Ulam is one of the many great mathematical minds of Polish descent. Known as a rather cantankerous fellow, he made major contributions in the fields of Set Theory, Topology, Ergodic Theory, Biology and Physics. A true polymath.

While residing in the United States during World War II, Ulam was invited to join a secret project in New Mexico by his close friend John von Neumann, one of the greatest mathematicians in history and an epic contributor to the fields of mathematics, economics, set theory, game theory, logic, quantum mechanics, computer science and nuclear physics. Von Neumann created the field of Cellular Automata (see Life Goes On: Further Ruminations on Conway's Game of Life for examples) with pencil and graph paper as his tools, building the first self-replicating automaton.

With his typical scientific curiosity, Ulam checked out a book on New Mexico from the library of the University of Wisconsin–Madison, where he was a faculty member at the time. On its check-out card, he found the names of all those who had previously disappeared from the campus. He subsequently joined the Manhattan Project at Los Alamos, where the development of the first atomic bomb was progressing.

As the atomic bomb project was nearing fruition, Teller had turned his attention to the creation of a vastly more powerful tool of destruction, the Hydrogen Bomb, or "Super" as it came to be known. Ulam, made aware of the "super", showed that Teller's design was faulty and devised a much more suitable design. While the design took on the moniker of "the Teller-Ulam design", most involved in the project credit Ulam with being the true father of modern thermonuclear devices, as contrasted to Teller's rumored contribution as the character model for Dr. Strangelove in the eponymous film by Stanley Kubrick.

A particularly interesting contribution of Ulam's to mathematics and topology specifically is the Borsuk-Ulam theorem, first conjectured by Ulam and later proved by Karol Borsuk in 1933. The theorem is stated as follows:

Call a continuous map f : S^m → S^n antipode preserving if f(−x) = −f(x) for all x ∈ S^m.
Theorem: There exists no continuous map f : S^n → S^(n−1) which is antipode preserving for n > 0.
In English, this states that any continuous function from an n-sphere to Euclidean n-space maps some pair of antipodal points to the same point.

In even more plain English, I'll try to explain why this leads to results that sound incredible to most non-mathematicians, and why it provides the opportunity for a "bar bet" that at the very least might get you a free beer.

Take the circle above with the two labeled points A and B. This will be the only drawing here, to start us off. I'm no artist, and my meager attempts at drawings for the steps would likely confuse rather than help. Get out your pencil and paper, put on your thinking caps, and follow along!

Imagine this is a line drawn around a sphere, such as the idealized Earth. The equator is an example of such a line. In this case, we only require that it is maximal in size, that is, it is drawn around the Earth in a way that it is as large as it can be, like the equator or any of the lines of longitude that run north to south on the globe.

You've perhaps heard the term "Great Circle" used for navigation. A Great Circle is a circle around the earth that is maximal. Any segment of such a circle that connects two points is the shortest distance between those points on the globe. So taking the great circle path is always the shortest route between points on the surface of the Earth.

A shape such as the circle we are using is known mathematically (at least by topologists) as a 1-sphere: a one-dimensional surface embedded in two-dimensional Euclidean space (its filled-in interior is the disc). What non-mathematicians call a sphere is known as a 2-sphere: a two-dimensional surface embedded in three-dimensional Euclidean space (the filled-in version being the ball).

We want to show that we can always find two points, opposite (antipodal) to each other on the planet, that have exactly the same temperature. If we find that points A and B in fact measure the same temperature for the circle we've drawn, we have made our point. But suppose the temperatures are different.

Let's call A the colder measurement point, and B the hotter measurement point. Now imagine moving the measuring points, keeping them diametrically opposed, around the circle until they have exchanged places. Let us assume that the measurements never share the same temperature. Because of this assumption, measurement point A must remain colder than measurement point B for the whole trip, and measurement point B must remain hotter than measurement point A.

But after rotating the measurement points 180°, we know that measurement point B must be measuring a colder temperature than measurement point A and vice versa. This contradicts our assumption, and by the mathematical technique of proof by contradiction we now know that there must have been some point along the path where measurement points A and B showed the same temperature.
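The rotation argument is really the intermediate value theorem applied to the difference g(θ) = T(θ) − T(θ + π): since g(θ + π) = −g(θ), g must cross zero somewhere. A sketch with an invented, perfectly continuous "temperature" function (any continuous T would do; the formula below is purely illustrative):

```python
import math

def temperature(theta):
    # an arbitrary smooth stand-in for temperature around a great circle
    return 10 * math.sin(theta) + 3 * math.cos(theta)

def g(theta):
    # difference between a point and its antipode; g(theta + pi) == -g(theta)
    return temperature(theta) - temperature(theta + math.pi)

# g(0) and g(pi) have opposite signs, so bisection finds a zero crossing
lo, hi = 0.0, math.pi
for _ in range(60):
    mid = (lo + hi) / 2
    if g(lo) * g(mid) <= 0:
        hi = mid
    else:
        lo = mid

theta0 = (lo + hi) / 2
print(abs(g(theta0)) < 1e-9)   # True: these antipodal points share a temperature
```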

The same proof technique can be extended to any 2-sphere/ball like the Earth. When moving to the higher dimensions, we are allowed more degrees of freedom in our chosen path for the antipodal points. We can choose any arbitrary meandering path, so long as the two measuring points remain antipodal.

Imagine we choose some path, again resulting in the exchange of positions of the two measurement points. By the same method, we can see that again there must be two antipodal points somewhere on that path that share the same measurement. This is critical to our emerging bar bet: every path taken by measurement point A on its way to measurement point B, with measurement point B following along by remaining diametrically opposed (antipodal) on the opposite side of the Earth, must have a point where the two points share the same temperature.

Let's consider the set of all pairs of antipodal points on the Earth that share the same temperature. These pairs need not all have one common temperature; each point need only match the temperature of its antipodal partner on the opposite side of the Earth to be a member of this special set.

You may be thinking that the result might be some myriad of dots scattered across the surface of the ball where the matching antipodal dot shares the same temperature. But this would be wrong!

If this was the case, we could always find some path from the original measurement point A to measurement point B without needing to touch any of the dots in the set. But as we have already shown, this set must contain a pair of the dots opposite each other on the Earth that share the same temperature along whatever path we take from A to B.

From this, we can conclude that the set of dots must contain a subset forming a continuous path that divides our ball (the Earth) into two sections, one containing the original measurement point A and the other containing measurement point B.

Now imagine applying the same line of thinking to this path, except we'll use some other continuous function such as barometric pressure. We pick two antipodal measurement points, call them C and D, and take our measurement. If the result is the same, we are finished showing the same fact that we proved for temperature holds true for barometric pressure. If they differ, we again let measurement point C follow the path to measurement point D, with point D following along on the opposite side of the globe. As with temperature, we will find some point along the path that has matching measurements for measurement points C and D.

But as we know, this point is on the path that contains all of the points where the temperature measurements are equal for the matching antipodal points of the path.

This means that that point, and its matching antipodal point, simultaneously have the same temperature and barometric pressure! Such points can always be shown to exist on a 2-sphere/ball, in our case the Earth, for any two continuous functions on its surface. We could have just as well used temperature and wind speed, wind speed and barometric pressure, etc.

The Borsuk-Ulam theorem proves that this is true for any pair of continuous functions on any 2-sphere! As can be seen in the theorem, it can be extended to any number of dimensions, with a corresponding increase in the number of continuous functions that will share a pair of antipodal points with equivalent values for each function.

This means for example that when that big plump helium-filled party balloon diffuses to death and is lying on the floor in a crumpled mass, there is a pair of points that were antipodal when the balloon was inflated that are now lying precisely on top of each other. Always.

So the next time you find yourself at the local watering hole with your bright companions, bet them a beer that there are two places on the opposite sides of the Earth, right now, that have exactly the same temperature and barometric pressure.

The bet is almost sure to be taken, and you'll earn yourself a free beer. Or maybe have one tossed at you!

While you might curse Ulam for his contributions to nuclear weaponry, remember to raise a toast to him before drinking your free beer! 

For more uses on this most interesting theorem, the book Using the Borsuk-Ulam Theorem: Lectures on Topological Methods in Combinatorics and Geometry by Jiri Matousek and published by Springer provides a rewarding overview in an approachable text at the elementary level.

Excellent introductory topology books can be had in Topology by K. Jänich (also from Springer) and the classic Topology by James R. Munkres, published by Prentice Hall. 

Topology and Its Applications by William F. Basener published by Wiley provides more examples of useful results from general topology.

The Topology entry at Wolfram Mathworld provides a one page description of the field with copious references. You may find yourself wandering the site following one interesting link after another.