Wolfram|Alpha: Systematic knowledge, immediately computable.

Wednesday, May 19, 2010

Nim Chimpsky Plays A Game: A Simple Game Played With Simple Recursion

Nim Chimpsky, named as a pun on Noam Chomsky, the brilliant linguist, cognitive scientist, and philosopher, was a chimpanzee that researcher Herbert S. Terrace of Columbia University claimed had learned a form of sign language and was able to communicate intelligently. Whether or not this was in fact true is still hotly debated in the research community.

If one believes that Nim had the intelligence to learn and use language, one might think the chimp could play one of the simplest of children's games, also called Nim. The game of Nim is a game of extensive form and perfect information that comes in many flavors. We will be using perhaps the simplest form, often called the subtraction game. This version consists of two players starting with a number (usually represented as a pile of objects like coins or pebbles), where each player in turn removes one, two, or three objects. The player who takes the last object is the loser.

It can be trivially shown that such a game reduces to avoiding being left with n ≡ 1 (mod 4) objects; in other words, a player should try to remove enough objects to leave the opponent with a count congruent to 1 (mod 4). In the general game, where the moves are limited to {1, 2, 3, ..., m}, this becomes 1 (mod m+1): if a player can take one to six objects in a turn, they want to leave the opponent with a count congruent to 1 (mod 7). If a player finds themselves in such a position already, they cannot move to a winning position, and the usual strategy is to remove only one object in the hope that the more objects remain, the more likely the opponent is to make a mistake.
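
As a quick illustration of this rule, here is a small sketch in Scheme (the language we'll use later in this post); this helper is my own, not part of the game code discussed below:

;; Sketch: winning move for standard Nim with legal moves {1, ..., m}.
;; Returns the number of objects to remove, or 1 when we are already stuck
;; in a losing position (n is congruent to 1 mod m+1).
(define (standard-nim-move n m)
  (let ((t (remainder (- n 1) (+ m 1))))
    (if (zero? t) 1 t)))

(standard-nim-move 17 3)  ; => 1  (17 is 1 (mod 4): a losing position)
(standard-nim-move 20 3)  ; => 3  (take 3, leaving the opponent with 17)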

When she was a toddler, the game amused my daughter. It didn't take long for her to figure out the strategy, however, and then it became boring. I came up with a variation that I called "Nim with holes". In this variation, instead of the moves being the positive integers up to some limit, the moves can be any member of some set X of positive integers agreed upon by the players. So a game might start off with a pile of fifty objects, with "legal" moves being the removal of, say, n ∈ {2,4,7,9} objects. Or perhaps "Prime Nim", where the moves are limited to n ∈ {2,3,5,7,11}. The "holes" version of the game can be analyzed in a similar fashion to the original; the closed-form strategy is left as an exercise for the reader.

Instead, let's look at solving the problem using a simple recursive program. A recursive program (or routine, or function) is one that references itself in its body; recursion is a staple of the functional programming style, where it is usually preferred over imperative/iterative methods.

For example, a simple function to calculate the factorial of a number n, usually denoted as n! and equal to the product of all positive integers less than or equal to n, could be written in the iterative and functional styles (using C for this example) as:


Factorial function in C language, written in iterative style (left) and functional style (right).

As can be seen, the iterative version of the function "iterates", or steps incrementally, toward the goal: starting with a value of one for the variable fact, it steps through each integer and multiplies the value of fact by that integer, until the input value n has been reached. The functional style, on the other hand, does away with the need for any intermediate variable. Instead, it "winds up" to the return value by taking the input value n and multiplying it by the value of the factorial function of n-1. In a recursive style of programming, a "stop" or "base case" must be defined to allow the "unwinding" of the recursion. In our example, we see that when the factorial function is passed a value less than or equal to one, it simply returns 1. This returned value is used in the multiplication by the caller, and so on back up the recursion chain until the original caller returns the final result.
 
The same factorial function, written using Scheme, would have the following form:
 
(define factorial (lambda (n) (if (<= n 1) 1 (* n (factorial (- n 1))))))
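
For comparison, the iterative style can also be expressed in Scheme using an accumulator (a sketch of my own, not taken from the screenshot above); the named let carries the running product, playing the role of the fact variable in the iterative C version:

;; Iterative-style factorial: fact accumulates the running product.
(define (factorial-iter n)
  (let loop ((i 1) (fact 1))
    (if (> i n)
        fact
        (loop (+ i 1) (* fact i)))))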
 
Recursion can be a difficult concept to get one's head around initially, especially for non-programmers (and even some programmers!). An excellent introduction to the thought process can be found in Thinking Recursively by Eric Roberts. A deeper examination can be had in the landmark book by Douglas Hofstadter titled Gödel, Escher, Bach: An Eternal Golden Braid. Anyone interested in recursion should have these books on their shelves. They are certainly must-haves for any self-respecting programmer.
 
We can see how solving our game of "Nim with holes" lends itself to a recursive solution by thinking about the possible plays going backwards, that is, starting from a losing position. Let's consider a game of "Nim with holes" where the players have agreed the legal moves must belong to the set {2,3,5}. We immediately see that a player finding themselves with one or two objects is in a losing position, and a player finding themselves with 3, 4, 5, 6, or 7 objects can leave their opponent in that losing position. As we move away from the goal, reversing the possible game play steps, we find that eight or nine objects, and fifteen or sixteen, are also "losing positions": the player faced with these cannot remove a number of objects that leaves their opponent in a losing position, nor reach a position from which the opponent cannot hand them the next losing position closer to the end game.
 
With a simple set of legal moves, this is easy to do in one's head. As the set of legal moves becomes more complex, determining the winning and losing positions in the game becomes more difficult. Algorithmically it is trivial, and using a recursive and functional style of programming, elegant.
 
We want to build a simple function that takes the game state (the number of objects) and the set of legal moves as its arguments, and returns "True" if we've won already, "False" if we're in a losing position (meaning we'll just remove the minimum allowed), or otherwise the position we need to move to that will put our opponent in a losing position (one we can then maintain for the rest of the game: once there, perfect play means the opponent is ultimately doomed).
 
A trivial realization of this using Scheme is shown below.
 
"Nim with holes" move analysis function written in Scheme with example results (click to enlarge).
 
The winning position (winpos) function takes the game's current state (number of objects left) and a list of the legal moves. If we've already won (no objects left), #t (true) is returned. Otherwise, for each move in the legal move list (using the map function in Scheme), the function tests whether any of the reachable positions is a losing position for the opponent, by testing each of them for the ability to reach a losing position, by testing each of them... until it winds recursively down to the base case: we have won. If the recursion is able to reach this state, when it is "unwound", the final result is the first position presently reachable that can lead us to the ultimate winning position. If the recursion cannot reach the ultimate winning position, the function returns #f (false), indicating we are presently in a losing position and should take the minimum allowed move.
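
Since the screenshot may be hard to read, here is a minimal sketch along those lines (my own reconstruction; the NimWithHoles.scm file linked below may differ in its details):

;; Sketch of a winpos-style analyzer (a reconstruction, not the original code).
;; Returns #t if we have already won (no objects left), #f if we are in a
;; losing position, or otherwise the position to leave the opponent in.
(define (winpos n moves)
  (if (= n 0)
      #t
      (let ((losing (filter (lambda (p) (and (>= p 0) (not (winpos p moves))))
                            (map (lambda (m) (- n m)) moves))))
        (if (null? losing) #f (car losing)))))

(winpos 17 '(2 3 5))  ; => 15: remove two objects, leaving the opponent with 15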

The operation of this recursion can perhaps be most simply imagined by first thinking about the game in what surely must be the simplest case: the only legal move is to take one object. Clearly, the winning positions alternate, with all odd numbers of objects being a losing position, and all even numbers being the winning positions. Pick some position, say five objects. The function tests for zero, which fails, resulting in the function calling itself for a value of four objects, which goes through the same sequence to call itself for three objects...until we arrive at the call to itself for zero objects.

This is our "base case", where the function returns #t (true). As this return is "unwound" or "percolated" back to the original caller, note that the logical not is applied once for each nested call. This is how the function alternates perspective between player and opponent: what is a winning position for a player is a losing position for their opponent, and vice versa. In our example case, we are in effect seeing the result of not(not(not(not(not(#t))))), resulting in #f (false), meaning our position of 5 objects is unfortunately a losing position when the only legal move is the removal of one object.
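
In Scheme terms (using the sketch above, for the one-move game just described):

(not (not (not (not (not #t)))))  ; => #f: five alternations of perspective
(winpos 5 '(1))                   ; => #f: five objects is a losing position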
 
We can see in the lower part of the IDE the results from asking the function to evaluate a game state of 17 objects with legal moves of 2, 3, or 5. The result indicates taking 2 objects and leaving 15 will put us in the winning position. The second example maps the winpos function onto a list of possible game states from 0 to 19, again using 2, 3, and 5 as the legal moves; the result is the list of results from the winpos function.
 
Readers can download the excellent (and free) DrScheme environment from PLT Scheme should they want to experiment with Scheme, functional programming, and recursion. The site has excellent tutorial materials and references for those interested in learning the details of this superb Lisp dialect. The trivial game-solving function example (which can of course be extended to any similarly solvable game) can be found at NimWithHoles.scm.

Monday, May 17, 2010

Read This Or I'll Punch You: Media and Societal Hype vs Violent Video Games

"It was almost frightening, the reaction... from teenage boys"

"a threat to the moral well being"

"fosters almost totally negative and destructive reactions in young people"

"deplorable"

"definite danger to the security of the United States"

Reactions to video games considered violent? No, these are things said about Elvis Presley and "rock and roll" music only a few decades ago. How quaint, and how ridiculous, the hype and overreaction seem to us now.

The same kinds of overreaction and hype have been seen from society at large and "experts" before: television, violent movies, comic books, and "professional" wrestling, to name a few.

The current poster child of the extremists these days seems to be "violent" video games, that is, video games that graphically portray violence, gore, criminal behavior, or other material considered by some to be "provocative" or otherwise objectionable. Opinion and studies have attempted to link such games to actual aggression and addiction by the players.

In this blog entry, I'm going to argue that these opinions and "studies" do not hold water.

According to statistics reported by the Entertainment Software Association, 68% of US households play video games, but only about a quarter of video game players are under the age of 18: the majority of players are adults, who in fact represent the fastest growing demographic of players, at around 35% of U.S. households. The average age of players is 35, and the most frequent game purchaser is 39 years old. An amazing fact to this author is that by 2009, 25% of video game players were over 50 years old - we might have to add a key for "walker" in addition to the sprint key... hardly children, I think you'll agree.

In addition, 92% of players under 18 (children) report that their parents are present when video games are purchased or rented, and nearly two-thirds of parents believe video games are a positive part of their children's lives.

And yet, studies such as "Effects of violent video games on aggressive behavior, aggressive cognition, aggressive affect, physiological arousal, and prosocial behavior: A meta-analytic review of the scientific literature" by Anderson, "Aggression and psychopathology in adolescents with a preference for violent electronic games"  by Funk et al., and "Violent video games: The newest media violence hazard" by Gentile all claim to show links between the playing of violent video games and actual aggressive and violent behavior by the players.

Is there such a link? As in any kind of investigative science such as this, there will be differing views. I believe a careful examination of reports such as those listed, and a comparison with those holding opposing views, must lead one to the conclusion that reports linking video game violence to actual violence suffer from poor methodologies, ignoring negative results, failing to cite academic research with differing views, and improper use of meta-analytical techniques. Craig A. Anderson, author of one of the more comprehensive papers supporting the hypothesis, and a witness before the U.S. Senate on the issue, seems to be a particularly egregious offender against scientific accuracy and journalistic integrity.

Major studies by groups such as The Harvard Medical School Center for Mental Health, The Journal of Adolescent Health, and The British Medical Journal have shown no significant link between video games and actual player violence. As is made clear in these studies, the old adage "correlation does not imply causation" holds true here, as it should in any honest scientific study. It appears to the author that most if not all of the authors of reports supporting the hypothesis have fallen into the logical fallacy of post hoc ergo propter hoc: "after this, therefore because of this". The mistake lies in reaching such a conclusion without considering confounding factors that could rule out any actual connection or causality. Correlations prove very little, if anything, other than that a correlation exists. Is it more likely that a violent game begets violence, or that violent children prefer violent games?

Constructing a scientifically valid test of the hypothesis that violent video game play begets actual violence may be nearly impossible: the very definition of "aggression" is difficult to measure objectively. Some studies done with college students, which attempt to measure aggression objectively by allowing players to blast their "opponents" with noise bursts as "punishment", are flawed by the very fact that the exercise is viewed by the participants as a game. There is no real "punishment"; the noise bursts cannot cause any real harm, and there is little if any remorse in punishing your opponent if they punish you.

In addition, an accurate definition of "violence" in video games themselves is hard to come by. Is the Madden football series really violent, as some studies state, considering the societal acceptance of the real game of football, where nearly 23 deaths per year are reported on average? Is the children's game Kirby violent? After all, the protagonist swallows his enemies whole, absorbing all of their powers.

I agree wholeheartedly with Nathaniel Edwards in his blogcritics posting: "No matter what a study's results show, the media can be counted on to warp it enough to make it interesting. Typically, this means that headlines claim a greater link between violent media and aggression. There are few details in the actual news stories, and instead there are lots of sweeping claims which don't allow the reader to interpret anything."

There can be no doubt that tragedies such as The Columbine High School massacre and The Virginia Tech massacre are calamities beyond the imaginations of most of us. Expressions of condolences to the victims and their families and friends seem hollow, so horrible were the events. Nonetheless, the immediate links to violent video games made by the media and "experts" were scientifically unfounded.

A particularly bright light of reason and scientific integrity is Christopher J. Ferguson of the Behavioral, Applied Sciences and Criminal Justice department at Texas A&M International University. Professor Ferguson has done extensive research on violent behavior, and has published landmark papers specifically examining the issue from a video game standpoint, along with various lay articles on the subject.

In his paper titled "The School Shooting/Violent Video Game Link: Causal Relationship or Moral Panic?", Ferguson states "Some scholars have attempted to draw links between laboratory and correlational research on video game playing and school shooting incidents. This paper argues that such claims are faulty and fail to acknowledge the significant methodological and constructional divides between existing video game research and acts of serious aggression and violence. It is concluded that no significant relationship between violent video game exposure and school shooting incidents has been demonstrated in the existing scientific literature, and that data from real world violence call such a link into question." 

The paper shows how the conclusions reached by the research he studied supporting the hypothesis that violent video gaming causes real violence are faulty. A most interesting fact revealed in the paper is that while video game play has grown explosively among children and adults, violent crime over the same period has decreased significantly, in both police arrest data and crime victimization data in the U.S., with similar results found in studies from Canada, Australia, the European Union, and the United Kingdom. The interested reader is referred to the link for the paper for details of Ferguson's study.

A more approachable article for the lay reader was published by Ferguson in the September/October 2009 issue of Skeptical Inquirer, titled "Violent Video Games: Dogma, Fear, and Pseudoscience". In this article, Ferguson reviews and distills research on both sides of the argument, reaching the same conclusion found in his academic research and publications: there is no proven link between the playing of violent video games and actual violence by the players of such games, and current research and studies claiming otherwise seem to suffer from severe methodological and other issues that compromise their scientific integrity and usefulness.

In particular, Ferguson shows how these studies typically suffer from severe "citation bias", that is, a failure to honestly report on research that does not support the author's hypothesis. Even more concerning are papers, such as the aforementioned Anderson study, where the authors appear to ignore their own results in order to forward their a priori hypothesis. In particular, Anderson used four measures of aggression in his laboratory studies, and found only a very weak significance for one of them. Had proper statistical techniques been employed by the author, even this weak link would have been shown to be statistically insignificant. Nonetheless, the author chose to ignore the results that did not support his hypothesis, and published a paper based on the single, intrinsically flawed, result set.

Ferguson sums up this kind of behavior, unfortunately typical of the side supporting the hypothesis, with: "I believe that these authors have ceased functioning as scientists and have begun functioning as activists, becoming their own pressure group. Indeed, in at least one article, the authors appear to actively advocate censorship of academic critics and to encourage psychologists to make more extreme statements in the popular press about violent video-game effects."

With respect to the links made between the tragic school shootings and the fact that the perpetrators played violent video games, Ferguson retorts "It is certainly true that most (although not all) school shooters played violent video games. So do most other boys and young men. Concluding that a school shooter likely played violent video games may seem prescient, but it is not. It is about as predictive as suggesting that they have probably worn sneakers at some time in the past, are able to grow facial hair, have testicles, or anything else that is fairly ubiquitous among males."

That legal nut cases such as Jack Thompson, a particularly troublesome and misinformed activist, have been disbarred for their antics gives a glimmer of hope that sanity will prevail. Just as the "Elvis is the Devil" chants that were once the norm now seem ludicrous, it appears the media and science are coming to their senses with respect to video games and violence. A recent paper in the medical journal The Lancet suggests "the time may have come to move beyond professional research focusing on media violence, generally, as a cause of serious aggression."

U.S. courts have blocked laws that attempt to outlaw violent video games, and most recently the U.S. Supreme Court justices have agreed to review a California law that attempted to restrict the sales of video games.

We may finally get the legal protection we deserve to allow us to purchase and play the games we choose. And as more researchers like Professor Ferguson present the facts, backed with proper research and scientific integrity, the spectre the media has painted linking video games to violent real-life behavior will fade into oblivion.

Doing the Jitterbug: Lag, Latency, Jitter and Throughput and How They Affect Online Gaming (Part I)

Consistency: In an ideal online gaming world, every player in the game would see the same consistent game world, and every client of a game server would have access to exactly the same game state information at any given instant. No player would suffer from seeing actions in the game world later than other players, or losing a shooting duel because the opponent has a faster, lower-latency connection to the game server. In the real world, this is of course impossible to achieve with existing technology.

Perhaps the greatest impediment to reaching this goal is caused by the vagaries of the underlying infrastructure that provides the communication between the game clients and servers: the Internet. The path to the goal of perfection is washed away by two primary effects in game communication over the Internet:
  • Throughput limitations - the connection between server and client has some maximum throughput.
  • Network delays - even with unlimited throughput, server messages do not arrive instantly.
The first, throughput limitations (bandwidth), means that the game state transmitted to game clients is at best a sample, or approximation, of the actual complete game world. The bandwidth and processing power simply do not exist in typical environments to allow all of the information required to completely represent the game state to be communicated. Just as the typical music CD has less information than the master recording from which it is made due to the sampling of the audio signal to match the constraints of the CD audio format, so too do the messages that the game clients and servers interchange. By careful crafting of the structure of the messages used, game developers can provide an approximation that is sufficient for realistic game play, so long as the minimum throughput requirements of the game are met.

The second, network delays, means that regardless of the available bandwidth, there is always a delay between the servers sending a message and the client receiving it and vice versa. In effect, the information in the message is time-shifted to a later wall-clock time: an event may happen at actual time X, but by the time the client receives the message of the event, the actual time is perhaps X+2.  When the possibility of time-shifting is present, the possibility of causality violations rears its head: events may seem to occur in the wrong order for game clients. For example, a player may open a door, but that door isn't there in the game world because it was destroyed by a grenade (and seen as such by another online player), an event that the client component of the first player was not yet aware of due to the delays in message arrival. Other players might see the first player's avatar "opening" a door that isn't even there.

We will review the details of these problems, the effect they have for players of online games, and the techniques commonly used to minimize the unfairness between players that can result if these problems are left untreated. We will cover the definitions and details of the problems in this blog entry, with part II to cover the effects these have on players and the techniques used in games to minimize their impact.

Throughput and Latency: 

The throughput requirements for online PC games vary widely, but in general are far below the available bandwidth of the typical client (we will only be discussing the client side and its impact; obviously, running a server for a game dictates a much higher aggregate bandwidth requirement). Recent studies (Feng 2002 & 2005) using games such as Counter-Strike, Day of Defeat, Medal of Honor: Allied Assault, and Unreal Tournament 2003 showed a client load ranging from 6400bps to 8000bps for client-to-server packets and 20800bps to 36000bps for server-to-client communications. These are far below even the lower-tiered ISP services typically used by online gamers.

Congestion of the network may cause throughput to drop to a level that is insufficient for smooth game play. Congestion typically occurs in one of three primary areas: the last mile near the user, the middle or Internet cloud, and the last mile on the server side.

In the case of user-side congestion, the player may simply have a service tier that does not provide sufficient bandwidth. This can of course be remedied with a service upgrade. At a minimum, a service with 600kbps down and 50kbps up should suffice. The faster downlink speed, while not strictly required to play online, will ensure faster downloads of game items such as server-hosted maps.

The gamer should also ensure that other activities on their local network are not causing congestion. Other users gaming, streaming audio or video, using services such as torrents, etc. can all adversely affect the overall available broadband bandwidth for the player.

Problems in the last mile on the server side can be caused by too many players joining a specific game server, causing a bottleneck on the network link to that server. Game servers typically employ a player count limit to avoid this occurrence. Any other congestion in this link of the game communication network (router congestion or failure modes, etc.) is likely to be out of the control of both the player and their server provider.

Congestion in the Internet cloud is usually temporal: perhaps a popular sporting event is being watched by large numbers of viewers via streaming technologies. As with most last mile issues on the server side, these are out of the control of the server provider and game player. In cases where Internet cloud congestion is the cause of game play issues, the only remedy is to wait until the problem "goes away".

Any kind of congestion, whatever the cause, can cause throughput degradation that may adversely affect the consistency of game play. If the game client is starved of message packets due to actual throughput issues or congestion related throughput issues, the synchronization between the client and server will be lost, resulting in "laggy" game play, "rubber-banding", and other temporal effects. Severe throughput problems can result in the game client "giving up" and disconnecting from the game server.

There is no single, commonly agreed upon definition of latency (Delaney 2006). The latency of a network is commonly measured using the ping command. This, however, measures not the one-way trip from client to server or vice versa, but the round-trip time. Since the routes from client to server and from server to client are usually asymmetric, simply taking half the value from a ping measurement may be grossly inaccurate, and provide incorrect information for making client and server timing decisions. In addition, such a measurement does not account for processing and other delays at the client and server endpoints.

A more useful measurement is the endpoint-to-endpoint measurement of latency that accounts for time needed for client-side processing, bi-directional network delay, and server-side processing (Stead 2008).
This is important: It has been found in studies that much of the delay in the overall game processing loop is caused by the game client handling and processing of messages.

The sources of network delay fall into four basic categories (Kurose 2009):
  • Transmission delay: Packet time to physical layer.
  • Queuing delay: Packet time waiting to be sent to a link.
  • Processing delay: Packet time spent at routers along the route.
  • Propagation delay: Packet time in physical link (bounded by the speed of light).
Transmission delay occurs during the movement of the packet to a physical link. For example, if you are using a 1Mbps WAN connection, each bit takes 1 µs to send, and a 500 byte (4000 bit) packet takes 4 ms.

Queuing delay can occur at routers along the path of the packets. If a router is under heavy utilization or the required outbound link is busy, the packet will be queued in a buffer until it can be sent.

Processing delay is also incurred at routers, since these must handle routing table checks, possible firewall rule application, and packet checksum and error checking.

Lastly, even if delays due to processing overhead, transmission, and queuing could be eliminated, we are still bound by the laws of physics. No signal can travel faster than light (2.998x10^8 m/s in vacuo). Speeds in actual transmission media will be lower (e.g. 1.949x10^8 m/s in typical optical fiber, significantly lower for twisted-pair copper). This means we are bounded by an absolute minimum round-trip latency of roughly 2 ms from client endpoint to server endpoint and back for a client-to-server distance of 200 km.
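
Both bounds are easy to check with a couple of lines of Scheme (a back-of-the-envelope sketch of my own, not code from any game or networking library):

;; Transmission delay: time to push a packet's bits onto the link.
(define (transmission-delay-ms packet-bytes link-bps)
  (* 1000 (/ (* packet-bytes 8) link-bps)))

;; Lower bound on round-trip time from propagation delay alone.
(define (min-rtt-ms distance-km medium-speed-m/s)
  (* 1000 (/ (* 2 distance-km 1000) medium-speed-m/s)))

(transmission-delay-ms 500 1e6)  ; => 4.0 ms for a 500-byte packet on a 1 Mbps link
(min-rtt-ms 200 1.949e8)         ; => ~2.05 ms over 200 km of typical optical fiber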

Jitter:

Jitter is the variation in network latency caused by changes in the state of the network. Packets that comprise the communication between the game client and server seldom follow the exact same route endpoint to endpoint. This can cause packets to have different latencies. In addition, network congestion can result in changes in the routing and router buffering behavior, changing the queuing delays for the affected routers.

We can visualize this effect with the aid of a diagram.


In this diagram, packets are sent from the server represented by the lower solid line at regular intervals (time ticks) to the client represented by the upper solid line. If we were able to construct a network with none of the four causes of latency outlined, and in addition discovered a way to violate the laws of physics and send our packets with infinite speed, the green line results: there is no latency between the server sending a packet and the client receiving it.

The more realistic example is represented by the blue line, which shows the slight delay the packet experiences traversing the network from the server to the client. The orange line depicts the next packet in the sequence, which is delayed by the same amount as the packet of the blue line. In the ideal world, the latency from the server to client and vice versa would exhibit this constancy. This would simplify any "compensation" for latency the game developers might wish to utilize, and even without compensation, humans tend to have an easier time adapting to latency in a game when it is relatively constant, even when the latency is rather large (Claypool 2006).

More typically, the game packets experience changes in latency from routing and congestion problems. This is illustrated with the final train of three packets colored red, magenta, and dark brick red. For these packets, it is clear any semblance of packet arrival at relatively regular time ticks is completely lost. There is currently no standard measure for jitter in game traffic. Jitter in networks tends to exhibit randomness, but can be characterized by a Gaussian distribution for inter-packet arrival times (Perkins 2003). Since we are bounded by conditions such as some minimal amounts of processing, queuing, and transmission delay in addition to the absolute bound due to the propagation delay, the actual distribution is biased: there is some absolute minimum that can be realized, and network congestion and related issues can cause delays to be skewed. This is illustrated in the following graph.


Graph of Gaussian (Red) and skewed/biased distributions (Blue) for inter-packet arrival times.

The fit is sufficient that we can use this model for predicting the likelihood of specific inter-packet times for use in the design of compensatory mechanisms for games.

In part II of Doing the Jitterbug, we will investigate what effects these issues have on game play, and what techniques can be used to minimize these effects.

Interested readers can find references for further study after the jump break.

Sunday, May 16, 2010

See Your Way Out - Using mathematics and image processing to quickly pathfind

As I outlined in my blog entry Earning Your Pathfinding Merit Badge: How Game Characters Navigate Their Game World, pathfinding (or path finding) is a critical component in PC games.

Pathfinding refers to the techniques used in games to allow Non-Player Characters (NPC) to navigate the game world.

Most games use some variant of Dijkstra's algorithm, a technique pioneered by Dutch computer scientist Edsger Dijkstra. Dijkstra's algorithm produces a shortest path tree by solving the shortest path problem for a graph with non-negative edge lengths (the distances between the vertices of the graph are never negative). It is one of a myriad of graph search (or graph traversal) algorithms.

Dijkstra's algorithm is often used in routing problems, such as network routing protocols, most notably IS-IS and OSPF (Open Shortest Path First), and transportation map shortest path solutions.

Perhaps the most commonly used variant in games is the A* search algorithm. Instead of using the distance from the start node (vertex) of the graph, this variant chooses nodes based on an estimate of the distance from the start node to the destination node. This estimate is formed by adding the known distance to the start node to a guess of the distance to the destination node. This guess is called the heuristic of the algorithm.

By utilizing such a technique, the A* algorithm provides improved performance when compared to Dijkstra's algorithm. If the guess is set to zero, the two algorithms are equivalent. When the guess is made positive but less than the true distance to the goal, A* continues to find optimal paths, but because fewer nodes are examined, performance is improved. When the guess exactly matches the actual distance, A* finds the optimal path and does so while examining the minimal possible number of nodes. If the guess is increased yet more, A* continues to find paths, examining fewer nodes still, but no longer guaranteeing an optimal path. In most games, since it is not practical to guarantee a correct heuristic at all times, this is acceptable due to the increase in algorithm performance.

With extremely complex environments (like the maze we'll be looking at), these algorithms can prove to be impractical, with other solutions preferred. A nice overview of such issues can be seen at Sean Barrett's Path Finding blog entry.

I'm going to go through a technique that is conceptually beautiful to me: Using image processing to solve a maze path finding problem.

I'm sure every reader is familiar with mazes. Perhaps some of you have found yourself lost in a particularly fiendish one, having to ring the nearby "rescue bell"!

I'll be using the superb Matlab system from Mathworks.


From their web site: "MATLAB® is a high-level language and interactive environment that enables you to perform computationally intensive tasks faster than with traditional programming languages such as C, C++, and Fortran." 

It most certainly is! Along with the equally superb Mathematica it is the staple of my tool set for doing complex mathematical programming, analysis, graphing, and data manipulation. These two products would be my desert island choices to take with me if I could only have two programming environments.

I'll be using the Matlab image processing package DIPimage, developed by Cris Luengo, Professor in multidimensional image analysis at the Swedish University of Agricultural Sciences and Uppsala University. I've borrowed copiously from his examples. Users of Matlab can obtain a license for DIPimage free of charge (normally a $3200.00 and up cost) for academic, non-commercial use. I highly recommend it!

Let's begin.

We will start with the following simple (no holes, loops, etc.) maze:



Note the entry and exit apertures in the lower right and upper right areas of the outer wall of the maze. Try solving it manually if you'd like!

Next, we use our image processing functions to convert our maze to a grey-value image and threshold it to obtain a binary image. The maze walls are now shown in red, with the path areas in black:

maze = colorspace(inputmaze,'grey');
maze = ~threshold(maze)


We use image processing functions to "label" any found walls and show them with differing colors. In this case, we clearly see the maze comprises two walls:

maze = label(maze);
dipshow(maze*2,'labels')


From this we can easily see the path is the area between the two maze walls. By applying a little mathematical morphology, we can isolate one of the walls:

path = maze==1


We then can do a binary dilation on the wall and fill in any holes (effectively a smearing of the wall):

pathwidth = 5;
path = bdilation(path,pathwidth);
path = fillholes(path)


Next, let us erode this smeared version of the wall. This operation effectively "shrinks" the wall by subtracting components smaller than the erosion radius; in the case of a binary image such as ours, it removes the perimeter pixels. We then take the difference between the smeared image and its eroded version:
 
path = path - berosion(path,pathwidth);


Finally, overlaying the result with our original maze shows our desired goal:

overlay(inputmaze,path)


I think you'll agree, a most fascinating use of tools to solve a problem from a domain you'd not likely think to be image processing.

The interested reader can find details of techniques such as these in these excellent references:
 
The Image Processing Handbook by John C. Russ
 
Digital Image Processing Using MATLAB by Rafael C. Gonzalez
 
Feature Extraction & Image Processing by Mark Nixon
 
Happy trails to you!

Saturday, May 15, 2010

The Alpha and The Omega of "Search" Engines?: WolframAlpha

You might have noticed I've embedded WolframAlpha into the heading area of the blog. You might not know what it is.

Stephen Wolfram is one of the brightest people you could ever meet. Most certainly a genius. Educated at Eton, and later attending the University of Oxford he had by the age of eighteen already written cited papers in particle physics and quark production. He received his PhD in particle physics at the California Institute of Technology, where he became a faculty member. There he pursued research in computer science, cellular automata, physics, and cosmology, earning him one of the first awards of the MacArthur Fellows Program, commonly known as the "Genius Award".

While at Caltech, Wolfram worked on the implementation of the computer algebra system SMP, essentially version one of what later became the widely used standard for technical computing, Mathematica. Irascible, opinionated, brash and forceful, Wolfram is not one to shy away from imposing his world views on others. His recent years of research led to the publication of A New Kind Of Science, outlined in my blog entry Life Goes On: Further Ruminations on Conway's Game of Life.

WolframAlpha is Stephen's attempt to provide a generalized "computational knowledge engine". Based on the prodigious capabilities of Mathematica, WolframAlpha simultaneously provides a hyper-capable calculating environment and the ability to query for data, either for the sake of the returned data or for use in calculations. The data used is "curated" by Wolfram Research, providing for a much more reliable level of quality compared to existing search engines.

This is no Google. Asking some generic question is likely to be met with the WolframAlpha equivalent of "Huh?":


Ask the engine something more precise, such as "United States population growth", and the character of the system begins to emerge:


Combine this with calculation capabilities second only to actually purchasing a copy of Mathematica to run locally, and you have truly phenomenal capacity for data gathering, analysis, and calculation at your fingertips.

Here, WolframAlpha shows us the integral of a more complex expression, sin(theta)/cos(theta)^2^pi:


Give it a try, and visit the site at WolframAlpha for more details.

To give it a whirl right now, try typing "ISS now" (without the quotes) into the WolframAlpha box at the top of the blog entry, and pressing enter or clicking on the equals sign. Be prepared to be impressed!

Bar Bets and Thermonuclear Bombs

Stanislaw Ulam, along with Edward Teller, fathered the thermonuclear bomb or Hydrogen Bomb as it is more commonly known. Ulam is one of the many great mathematical minds of Polish descent. Known as a rather cantankerous fellow, he made major contributions in the fields of Set Theory, Topology, Ergodic Theory, Biology and Physics. A true polymath.

While residing in the United States during World War II, Ulam was invited to join a secret project in New Mexico by his close friend John von Neumann, one of the greatest mathematicians in history and an epic contributor to the fields of mathematics, economics, set theory, game theory, logic, quantum mechanics, computer science, and nuclear physics. Von Neumann created the field of cellular automata (see Life Goes On: Further Ruminations on Conway's Game of Life for examples) with only pencil and graph paper as tools, building the first self-replicating automaton.

With his typical scientific curiosity, Ulam checked out a book on New Mexico from the University of Wisconsin–Madison library, the campus where he was a faculty member at the time. On its check-out card, he found the names of all those who had previously disappeared from the campus. He subsequently joined the Manhattan Project at Los Alamos, where the development of the first atomic bomb was progressing.

As the atomic bomb project was nearing fruition, Teller had turned his attention to the creation of a vastly more powerful tool of destruction, the Hydrogen Bomb, or "Super" as it came to be known. Ulam, made aware of the "super", showed that Teller's design was faulty and devised a much more suitable design. While the design took on the moniker of "the Teller-Ulam design", most involved in the project credit Ulam with being the true father of modern thermonuclear devices, as contrasted to Teller's rumored contribution as the character model for Dr. Strangelove in the eponymous film by Stanley Kubrick.

A particularly interesting contribution of Ulam's to mathematics and topology specifically is the Borsuk-Ulam theorem, first conjectured by Ulam and later proved by Karol Borsuk in 1933. The theorem is stated as follows:

Call a continuous map f: S^m → S^n antipode preserving if f(-x) = -f(x) for all x ∈ S^m.
Theorem: There exists no continuous map f: S^n → S^(n-1) which is antipode preserving for n > 0.
In English, this states that any continuous function from an n-sphere to Euclidean n-space maps some pair of antipodal points to the same point.

In even more plain English, I'll try to explain why this results in things that can sound incredible to most non-mathematicians, and why it provides the opportunity for a "bar bet" that at the very least might get you a free beer.


Take the circle above with the two labeled points A and B. This will be the only drawing here, to start us off. I'm no artist, and my meager attempts at drawings for the steps would likely confuse rather than help. Get out your pencil and paper, put on your thinking caps, and follow along!

Imagine this is a line drawn around a sphere, such as the idealized Earth. The equator is an example of such a line. In this case, we only require that it is maximal in size, that is, it is drawn around the Earth in a way that it is as large as it can be, like the equator or any of the lines of longitude that run north to south on the globe.

You've perhaps heard the term "Great Circle" used for navigation. A Great Circle is a circle around the earth that is maximal. Any segment of such a circle that connects two points is the shortest distance between those points on the globe. So taking the great circle path is always the shortest route between points on the surface of the Earth.

A shape such as the circle we are using is known mathematically (at least by topologists) as a 1-sphere: it is a one-dimensional curve embedded in two-dimensional Euclidean space. What non-mathematicians call a sphere is known as a 2-sphere or ball: a two-dimensional surface embedded in three-dimensional Euclidean space.

We want to show that we can always find two points, opposite (antipodal) to each other on the planet, that have exactly the same temperature. If we find that points A and B in fact measure the same temperature for the circle we've drawn, we have made our point. But suppose the temperatures are different.

Let's call A the colder measurement point, and B the hotter measurement point. Now imagine moving the measuring points, keeping them diametrically opposed, around the circle until they have exchanged places. Let us assume that the measurements never share the same temperature. Because of this assumption, measurement point A must remain colder than measurement point B for the whole trip, and measurement point B must remain hotter than measurement point A.

But after rotating the measurement points 180°, we know that measurement point B must be measuring a colder temperature than measurement point A and vice versa. This contradicts our assumption, and by the mathematical technique of proof by contradiction we now know that there must have been some point along the path where measurement points A and B showed the same temperature.

The same proof technique can be extended to any 2-sphere/ball like the Earth. When moving to the higher dimensions, we are allowed more degrees of freedom in our chosen path for the antipodal points. We can choose any arbitrary meandering path, so long as the two measuring points remain antipodal.

Imagine we choose some path, again resulting in the exchanging of the positions of the two measurement points. By the same method, we can see that again there must be two antipodal points somewhere on that path that share the same measurement. This is critical to our emerging bar bet: it means that every path taken by measurement point A on its way to measurement point B, with measurement point B following along by remaining diametrically opposed (antipodal) on the opposite side of the Earth, must have a point where the two points share the same temperature.

Let's consider the set of all possible pairs of antipodal points on the Earth with the same temperature. These points need not all have the same temperature; each must only have the same temperature as its matching antipodal point on the opposite side of the Earth to be a member of this special set.

You may be thinking that the result might be some myriad of dots scattered across the surface of the ball where the matching antipodal dot shares the same temperature. But this would be wrong!

If this was the case, we could always find some path from the original measurement point A to measurement point B without needing to touch any of the dots in the set. But as we have already shown, this set must contain a pair of the dots opposite each other on the Earth that share the same temperature along whatever path we take from A to B.

From this, we can conclude that the set of dots must contain some subset of dots forming a continuous path dividing our ball (the Earth) into two sections, one containing the original measurement point A and the other containing measurement point B.


Now imagine applying the same line of thinking to this path, except we'll use some other continuous function such as barometric pressure. We pick two antipodal measurement points, call them C and D, and take our measurement. If the result is the same, we are finished showing the same fact that we proved for temperature holds true for barometric pressure. If they differ, we again let measurement point C follow the path to measurement point D, with point D following along on the opposite side of the globe. As with temperature, we will find some point along the path that has matching measurements for measurement points C and D.

But as we know, this point is on the path that contains all of the points where the temperature measurements are equal for the matching antipodal points of the path.

This means that that point, and its matching antipodal point, simultaneously have the same temperature and barometric pressure! Such points can always be shown to exist on a 2-sphere/ball, in our case the Earth, for any two continuous functions on its surface. We could have just as well used temperature and wind speed, wind speed and barometric pressure, etc.

The Borsuk-Ulam theorem proves that this is true for any pair of continuous functions on any 2-sphere! As can be seen in the theorem, it can be extended to any number of dimensions, with a corresponding increase in the number of continuous functions that will share a pair of antipodal points with equivalent values for each function.

This means for example that when that big plump helium-filled party balloon diffuses to death and is lying on the floor in a crumpled mass, there is a pair of points that were antipodal when the balloon was inflated that are now lying precisely on top of each other. Always.

So the next time you find yourself at the local watering hole with your bright companions, bet them a beer that there are two places on the opposite sides of the Earth, right now, that have exactly the same temperature and barometric pressure.

The bet is almost sure to be taken, and you'll earn yourself a free beer. Or maybe have one tossed at you!

While you might curse Ulam for his contributions to nuclear weaponry, remember to raise a toast to him before drinking your free beer! 

For more uses on this most interesting theorem, the book Using the Borsuk-Ulam Theorem: Lectures on Topological Methods in Combinatorics and Geometry by Jiri Matousek and published by Springer provides a rewarding overview in an approachable text at the elementary level.

Excellent introductory topology books can be had in Topology by K. Jänich (also from Springer) and the classic Topology by James R. Munkres, published by Prentice Hall. 

Topology and Its Applications by William F. Basener published by Wiley provides more examples of useful results from general topology.

The Topology entry at Wolfram Mathworld provides a one page description of the field with copious references. You may find yourself wandering the site following one interesting link after another.



Thursday, May 13, 2010

I've Got A Stalker: S.T.A.L.K.E.R. Shadow of Chernobyl, that is.


S.T.A.L.K.E.R.: Scavenger, Trespasser, Adventurer, Loner, Killer, Explorer, Robber.

Sounds like my neighbor. I remember sometime in 2004, after an all night gaming session on the classic Far Cry in my LAN room, one of my gaming buddies turned to me and asked if I'd seen the web site for an upcoming game, S.T.A.L.K.E.R.: Shadow of Chernobyl. I hadn't, so we took a look.

I was blown away by the description and the screen shots shown, the graphics looked amazing compared to any shooter we'd played up to that time. The rumored release of the game was soon, so the juices started flowing for what looked to be a most excellent addition to the game collection.

Unfortunately, the game faced delay after delay, eventually resulting in a ninth place finish in Wired's Vaporware '06 contest. In January 2007, a contest for players to experience the game beta in a marathon session collapsed when the THQ staff (publishers of the game) that had organized the event were themselves unable to obtain copies of the game.

The game finally made its public debut at the end of March, 2007.

The game is played in an area known as the Zone, based on the Exclusion Zone around the area of the Chernobyl Disaster, with story components borrowed from the novel Roadside Picnic. In the game, a second disaster occurs twenty years later, leading to mutations, anomalies, and the associated valuable artifacts, prime scavenging material for the game characters.

This was the first game I'd ever played that had any kind of real "scavenger" aspect, that is, the "gatherer" part of "hunter-gatherer". This aspect of the game is in fact central to game play. The character you play (I'll not reveal details, as that would be a plot spoiler) starts the game basically buck naked as far as weaponry and protective clothing go.

By fulfilling mission requests provided by all sorts of NPCs and by obtaining or otherwise finding valuables in the game, the player builds a stock of goods that can be sold or traded for items such as food, weaponry, clothing, ammunition, artifacts, etc.

The artifacts in the game play a key role in trading and player protection:

"As a result of the second Chernobyl disaster, The Zone is littered with small areas of altered physics, known as anomalies. There are several different variations, each one having a unique impact upon those who cross its path. They can be potentially deadly to the player and other NPCs, delivering electric shocks, or pulling them into the air and crushing them.

Most anomalies produce visible air or light distortions and their extent can be determined by throwing bolts (of which the player carries an infinite supply) to trigger them. Some stalkers also possess an anomaly detector, which emits warning beeps of a varying frequency depending on their proximity to an anomaly. The guide in the film Stalker, and his predecessors in the Strugatsky brothers' book Roadside Picnic, test various routes before proceeding. In the film, metal nuts tied with strips of cloth are used.

Anomalies produce Artifacts, the valuable scientific curiosities that make the Zone worth exploring monetarily. As well as being traded for money, a number of Artifacts can be worn so that they provide certain benefits and detriments (for example, increasing a stalker's resistance to gunfire while also contaminating him with small amounts of radiation). Artifacts are found scattered throughout the Zone, often near clusters of anomalies."

At first, I found this aspect of the game boring and time consuming. Little did I know I would soon become addicted to the hunt, particularly for the more rare items. I soon came to appreciate this type of game play, common in the MMORPG games such as World of Warcraft that I'd poked fun at.

Game play covers a huge area of many square miles. Originally, the game was to be completely open, but by release, the map had been subdivided into eighteen areas, each reachable through specific passages. Nonetheless, the game play always feels expansive, with superb draw distances.

The ultimate goal of the player is to reach the Chernobyl reactor itself, to reach the most curious of all of the strange items in the game. The actual end game depends on how the player has progressed through the game.

This, combined with the wide range of choices in interactions and missions (Who do I want to befriend? Who do I decide to trade with? What items do I want for my character?) leads to excellent replay value in the game.

The game is based on the in-house-developed X-Ray graphics rendering engine. From the Wikipedia entry:

"The X-ray Engine is a DirectX 8.1/9 Shader Model 3.0 graphics engine. Up to a million polygons can be on-screen at any one time. The engine features HDR rendering, parallax and normal mapping, soft shadows, motion blur, widescreen support, weather effects and day/night cycles. As with other engines that use deferred shading, the X-ray Engine does not support anti-aliasing with dynamic lighting enabled. However, a "fake" form of anti-aliasing can be enabled with the static lighting option; this format utilizes a technique to blur the image to give the false impression of anti-aliasing."

Even in 2010, the graphics hold up well, especially on high-end machines.

The A.I. system was also built in-house. The "ALife" system was originally to have NPCs constantly active in the game world, regardless of player interaction. By release, this had been reduced in functionality. Nonetheless, the capabilities are quite robust, as described in the Wikipedia entry:

"GSC Game World's proprietary ALife artificial intelligence engine. ALife supports more than one thousand characters inhabiting the Zone. These characters are non-scripted, meaning that AI life can be developed even when not in contact with the player.

The NPCs have a full life cycle (task accomplishment, combat, rest, feeding and sleep) and the same applies to the many monsters living in the Zone (hunting, attacking stalkers and other monsters, resting, eating, sleeping). These monsters migrate in large groups. The non-scripted nature of the characters means that there are an unlimited number of random quests. For instance, rescuing stalkers from danger, destroying stalker renegades, protecting or attacking stalker camps or searching for treasure. The AI characters travel around the entire zone as they see fit.

Numerous tactics can be employed to complete the game, such as rushing or using stealth and sniping. The NPCs will react in a different way to each of them. S.T.A.L.K.E.R.'s NPCs plan ahead by "Goal-Oriented Action Planning" to achieve this."
 
S.T.A.L.K.E.R. uses a modified version of the ODE physics engine to provide rag doll physics and accurate bullet ballistics: bullets are affected by gravity and ricochet off surfaces.
 
Adding to the realism is a completely dynamic day/night progression of time, including weather effects such as rain, lightning, showers, and sunlight.
 
An eerie soundtrack completes the palette of game play. This was the first and only game I've played that had parts that really got the creep-o-meter rising for me. There were many moments exploring the bowels of some haunted, deserted underground laboratory where the anticipation of terror had my pulse rising.
 
Even at an age nearing four years, it remains a most worthwhile game, for both FPS fans and RPG players. That it is now available on Valve's excellent Steam system for under $20.00 makes this a no-brainer if you have not already played it. Since the Steam version carries the latest patch, most of the niggling bugs that were in the initial release have been remedied.

The follow-up games from the same developer, S.T.A.L.K.E.R.: Clear Sky, released in September 2008 for North America, and S.T.A.L.K.E.R.: Call of Pripyat, released in February 2010, fail to capture the magic of the original in my opinion. The former was an unmitigated disaster, bug-ridden and having none of the flavor that made the original so engaging. The latter returned to more of the game play mechanics of its progenitor, and is arguably the least bug-plagued of the three. Recommended for players who loved the original, but for others only if it can be had at a bargain price.

All three games in the series have an active "modding" community, providing new maps, characters, game and character abilities, and modifications of difficulty levels. This adds considerably to the play value and longevity of the games.

Like many things in life, this is a case where the original is the best.
 
Highly recommended, grade A material.

The Official Unofficial Bigfoot Challenge: Throwing Down The Gauntlet of Science at Marketing BS.

I decided to move this from my blog entry She Blinded Me With Science: Tuning Up Your BS Detectors to make it more easily findable by Bigfoot Networks.

Bigfoot: All ribbing aside, I'm quite serious. Let's put your outlandish claims under the harsh light of scientific scrutiny. Win or lose, a school gets a nice PC, so think of it as an educational opportunity combined with worthwhile charity. You can comment here or e-mail me for my counsel's contact information, where the contract for the challenge can be fine-tuned based on the outline below and then finalized.

I look forward to your response.


The Official Unofficial Bigfoot Challenge

(1)
Bigfoot Networks ("Bigfoot") will supply a trained network performance expert player (at their expense) of their choice from their pool of network performance experts shills.

(2)
I will provide a newly built, high-performance gaming PC, configured with components typically used by gamers on a budget, not to exceed $2000.00 in total cost. Certainly a machine far below what the highly compensated and sponsored professionals would have. You know, the pros making all the glowing statements on the Bigfoot web site.

Such a machine would of course be handicapped in performance compared to the machines of a pro gamer, so any benefit from the Bigfoot technologies would be expected to be magnified. The machine will have a fresh, unmolested installation of the Windows 7 64-bit operating system, patched to the current patch and update state at the time of the challenge.

The tester (me) will install the game to be used for the challenge on the day of the challenge. The game will be patched to the patch level available at that time. Any customization of the game other than that available via normal in-game interfaces will be provided to the tester for review and approval, and will be applied by the tester, subject to verification by the network performance expert player.

(3)
The network performance expert player will play a series of games, not less than 20 games or twenty hours, whichever is greater. At the start of gaming, and between games or once every hour, the network interface will be switched between the on-board NIC of the test PC and the Bigfoot "Killer 2100" NIC. This will be done by an appropriate random selection, known only to the tester (me). A minimal sketch of how such a blinded selection and the later analysis of the guesses might look appears after the challenge terms below.

(4)
At each game or session start, the identity of the NIC currently in use will be noted and placed in a time-stamped envelope by the tester.

(5)
The network performance expert player will, at the end of each game or session, write down their guess as to which NIC was in use. In addition, the appropriate score metrics will be noted to provide data on network performance expert player performance during the game. This will be placed into the same envelope, which will then be sealed and placed into a locked, slotted box to prevent tampering.

Alternatively, utilizing accepted data security techniques as outlined by Carnac the Magnificent, and in light of Bigfoot's likely concerns about this data being 'leaked', the envelopes can be hermetically sealed, placed in a mayonnaise jar, and at a time no later than noon the following day the aforesaid jar shall be placed on Funk & Wagnalls' porch.

(6)
At no time will any employee of Bigfoot be allowed contact or interaction with the test PC, other than access supervised and approved by the tester to allow Bigfoot to validate the correctness and specifications of the test PC build. The network performance expert player will similarly not be allowed access to the machine other than through the mouse and keyboard(s) provided by the network performance expert player.

Specifically, the network performance expert player will not have access to the network connections, which will be switched via a switching unit utilizing a two-into-one topology, so that both NICs on the PC will have cables attached for the duration of the test. The switch box will be viewable and accessible only to the tester.

Any indicators or other applications that provide notification of NIC or network status, performance, or related information will be disabled or made otherwise unavailable.

(6a)
Any and all non-standard or non-default utilization of the network "stack" or operating system by the Bigfoot "Killer 2100" NIC and associated software will be specified in writing and certified by Bigfoot as encompassing any and all such utilization. Such documentation will be provided to the tester at least three days before the challenge test to allow configuration of the alternate NIC and OS as appropriate. This includes, but is not limited to: changes in the use of "Nagling"; packet fragmentation or coalescing behaviors; any other interaction with the underlying OS environment and network facilities; and any interaction with the OS that affects process priorities or other characteristics that may affect system, network, game, or process performance in a way that is advantageous to the Bigfoot NIC and that could also be utilized by the alternate NIC but may not be in the default settings for Windows. Such documentation must include any and all system changes made by the installer(s) for Bigfoot-related software components, and any and all system changes made by the utilization of Bigfoot-related software components.

(6b)
The tester reserves the right to install and utilize monitoring applications to determine system changes made by the installation of any and all Bigfoot-related software during the installation process and during game play testing. This includes, but is not limited to, Windows registry, process control, and network object monitoring. In addition, the tester reserves the right to utilize packet analysis tools during game play testing to ensure that the comparison between NICs uses packets of a similar nature.

(6a) and (6b) ensure that the installation and use of the Bigfoot NIC and related software do not cause covert or overt changes to normal Windows OS, network, or process behavior that would have a performance impact that could also apply to the alternate NIC but may not be enabled by default in Windows. In other words, they ensure that it is the card and associated driver, and only the card and associated driver, that is affecting performance, if any such effect exists.

Since it would obviously be trivial for an installer, application, or driver to make such changes covertly (e.g., my installer could manipulate the on-board NIC's parameters unfavorably, and/or manipulate the parameters for my "super NIC" to produce favorable results), and since no prior "test" of the Bigfoot products has to my knowledge eliminated this possibility, we will ensure it for this challenge.

The tester reserves the right to install and utilize network performance monitoring tools to record quantitative data to be included with the primary score and NIC identification data. This data, if gathered, will be statistically analyzed to determine if any statistically significant differences exist between the Bigfoot NIC and the alternate NIC. Only open-source monitoring tools will be utilized, such that Bigfoot can assess the correctness of the monitoring tool(s). By the same token, any tools that the tester chooses to utilize that are supplied by Bigfoot must be provided with complete source code and build environments, allowing independent validation and compilation of the tool(s). This ensures that no manipulation of measurement or output data is done by the Bigfoot-supplied tools.

(7)
At no time when at the PC will the network performance expert player be allowed to utilize any application other than the game played, or to directly access any programs, commands, or other services or applications intrinsic to Windows. No performance measurement device or application may be used, nor will the use of any in-game measurement related to network performance be allowed. This is to prevent the network performance expert player from attempting to discern which NIC is in use by means other than game play performance.

The only interactions allowed with the test PC for the network performance expert player will be for activities directly related to the playing of the test game, game configuration, and score data recording.

(8)
Bigfoot and I may agree to a 'disinterested' third party to conduct the actual testing, including but not limited to the switching of the NICs and the recording of data.

(9)
The test will be held at a mutually agreed-upon location with sufficient network bandwidth. The servers for the games will also utilize providers mutually agreed upon by Bigfoot and me. For any game, at least ten providers will be specified, half by the tester and half by Bigfoot. The provider to be used for the tests will be selected by an appropriate random selection process by the tester at the time of the test. If it is determined there will be a LAN component to the challenge tests, the PC used for the server must use a standard retail NIC other than Bigfoot products. Any LAN server PC, if used, will have only a fresh, default Windows 7 installation and the default installation files and configuration for the game. This is to reduce the possibility of collusion with game server providers and to eliminate the possibility of any covert or overt use of communication techniques dependent on any form of endpoint cooperation and only available when Bigfoot products are at both endpoints of the network.

I humbly suggest the Google campus for the test. Sergey and Larry will surely accommodate us, and they both enjoy a good scientific venture. Heck, they might even help us televise it!

(10)
The Bigfoot "Killer 2100" NIC will be a retail model, purchased by the tester, as part of the test PC build.

No information, such as the MAC address of the NIC, will be made available to Bigfoot until the termination of the test. Both NICs will use a 'spoofed' MAC during game play, provided and controlled by the tester, to ensure that no identification of which NIC is in use can be attempted via packet analysis.

(11)
The data gathered will be independently analyzed by an agency approved by both parties to determine its statistical significance.

(12)
The results will be published on the home/front page of the Bigfoot Networks web site. They will remain there for a period of not less than one hundred eighty (180) days.

Any changes to the domain name or other infrastructure that would prevent visitors to the site from accessing this information must be accompanied by whatever changes to the published results are needed to ensure they remain visible.

(13)
This item is not part of the challenge requirements. It's just a placeholder for a "Hello!" to the apparently underpaid lawyers of Bigfoot who will surely be asked to review this and figure out a way to have some kind of "take down" issued so Bigfoot doesn't have to explain why they didn't take up the challenge. Hello!

(14)
If there is a statistically significant network performance expert player performance benefit shown when using the Bigfoot "Killer 2100" NIC, and the network performance expert player shows a statistically significant ability to correctly identify the NIC that was used for each session, I will donate the complete PC to the school or other educational facility of Bigfoot's choice.

(15)
If no statistically significant network performance expert player performance benefit is seen when utilizing the Bigfoot "Killer 2100" NIC, or the network performance expert player is unable to identify which NIC was in use for each session to a statistically significant level, Bigfoot agrees to reimburse the author (me) for all costs incurred for the test PC, which I will then donate to the school of my choice.

(16)
Any violation of the rules, specifications and stipulations agreed upon will constitute a forfeiture by the violator.
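To make items (3), (11), (14), and (15) a bit more concrete, here is a minimal sketch, in C, of how the blinded NIC selection and the eventual analysis of the identification guesses might look. It is illustrative only: the session count, the number of correct guesses, and the use of rand() for the random selection are assumptions made for the example, not part of the challenge terms, and the real analysis would be performed by the independent agency of item (11).

/*
 * Hypothetical sketch of the blinded-selection and analysis steps of the
 * challenge. The schedule generation stands in for the "appropriate random
 * selection, known only to the tester" in item (3); the p-value calculation
 * stands in for the identification analysis in items (11), (14) and (15).
 * The session count and the number of correct guesses are made-up
 * illustration values, not results.
 */
#include <math.h>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define SESSIONS 20   /* minimum number of games/sessions in the challenge */

/* One-sided exact binomial p-value: the probability of k or more correct
 * NIC identifications out of n sessions if the player is purely guessing
 * (p = 0.5). Uses lgamma() to compute log C(n,i) without overflow.       */
static double p_value_guessing(int n, int k)
{
    double p = 0.0;
    for (int i = k; i <= n; i++) {
        double log_choose = lgamma(n + 1.0) - lgamma(i + 1.0) - lgamma(n - i + 1.0);
        p += exp(log_choose + n * log(0.5));
    }
    return p;
}

int main(void)
{
    /* Step 1: the tester privately generates the NIC schedule, one entry
     * per session, before any games are played. 0 = on-board NIC,
     * 1 = Killer 2100. rand() is used purely for illustration; the real
     * selection could just as well be coin flips witnessed by the
     * third party of item (8).                                          */
    int schedule[SESSIONS];
    srand((unsigned)time(NULL));
    for (int i = 0; i < SESSIONS; i++) {
        schedule[i] = rand() % 2;
        printf("session %2d: %s\n", i + 1,
               schedule[i] ? "Killer 2100" : "on-board NIC");
    }

    /* Step 2: after the envelopes are opened, count the player's correct
     * guesses and ask how likely that count is under pure guessing. A
     * small p-value (say, below 0.05) would indicate the player really
     * could tell the NICs apart. The value 14 is a made-up example.     */
    int correct = 14;
    printf("P(%d or more correct out of %d by guessing) = %.4f\n",
           correct, SESSIONS, p_value_guessing(SESSIONS, correct));
    return 0;
}

Compile with any C99 compiler and link against the math library (e.g., gcc -std=c99 sketch.c -lm). For what it's worth, 14 correct calls out of 20 gives a guessing p-value of roughly 0.058, just short of the conventional 0.05 threshold, which is exactly why the analysis should be left to the agreed-upon independent agency rather than to either party's arithmetic.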

Taking a Bite from the Poison Apple: Gaming on a Mac.

I like Apple. I like the Macintosh in all its current incarnations, from mini to laptop to traditional deskside form factors (am I allowed to call it Macintosh, or is it just Mac now? I'm not sure what the current cult protocol dictates.)

I like the marketing of the company: it is consistently slick, polished, and clever.

I like the superiority complex that emanates from Apple's cultish leader Steve Jobs and that seems to ooze out of the corporation and infect many of the consumers of the products. It gives my daughter and me endless fun to watch the iMonkeys at the Apple stores work their conversion magic on prospective cult members, like Kaa working Mowgli (for some reason, things like this seem funnier to me in German.)

I like Steve. Steve Wozniak, that is. Without him, there would be no Apple. A hilarious guy, as down to earth as they get. And brilliant. I like the Jobs version of Steve too, but he couldn't code his way out of a paper bag, and I just have no interest in having a coffee and shooting the breeze with him like I do with "The Woz". Woz has donated huge sums of money and technology to local schools. Education for our children is important to him, and I admire that deeply. He dated Kathy Griffin. That must have been a hoot.

I've enjoyed the rage tantrums of the Jobs incarnation of Steve since my own personal experience with one twenty some years ago when he asked me the wrong question.

"What do you think of our super cool NeXT hardware?" he asked me. I responded with something like "It's an already outdated steaming turd that is only falling further behind Intel based stuff each day. Dump it. Dump any of the business that's hardware. Put NeXTSTEP on Intel, it's so much better than Windows, you'll rule the world." You'd have thought I'd told the Pope that God was a phony.

By the time NeXT got their act in gear, dropped the hardware facade, and made NeXTSTEP available for commodity hardware, it was too late: Windows had evolved and had grown in market share to an insurmountable lead. What a disaster, a ship steered by an ego-blinded captain. Not his first flop, nor will it be his last in my opinion. But make no mistake: Jobs could sell $50.00 bags of ice cubes to Eskimos, and have them believing his version tastes better. You've got to admire that.

I like that since hardly anyone uses the Mac product (under 4% of the market), hackers don't waste their time on attacking the machine. I don't like that Apple markets this under the guise of their machine being more secure than Windows machines: security through obscurity is useless, and the fact that Apple hardware is consistently the first to fall at white hat security hacking contests demonstrates this. Nonetheless, in the same way that no city burglar is going to go out of his way and drive a hundred miles into the countryside just to rob you, the platform is much less likely to find itself under attack. For now at least.

I like that the very narrow range of hardware options used (and controlled) by Apple makes life easier for their OS developers. Stuff just works. That Windows works so well with the innumerable combinations of hardware it can be installed on is miraculous; Apple chose to simplify the problem and has done a superb job of it.

I like the support infrastructure of Apple. This is one area where I've yet to see anything even close in the traditional PC world. The Apple store staff knows their stuff. The Genius Bar really knows their stuff. A user of the product can get face-to-face, quality technical support for zero or minimal cost, instead of spending hours talking or chatting on-line with tech support via some version of the typical cookie-cutter outsourced staff.

It all boils down to this: the Mac is the machine least likely to be bought by me, and the most likely to be recommended by me. Except to serious gamers. Allow me to explain.

When friends come to me seeking advice for a PC (I'll use the term generically to mean both traditional hardware and that of the Apple persuasion), I ask them some pretty simple questions.

If the answers indicate that they do not need the latest and greatest hardware, add-in cards, and so forth, I usually point them to Apple. The machines are easy for the novice to learn and use. They tend to "just work" due to Apple's vice-like grip on the narrow range of hardware components allowed in the machine. The support infrastructure Apple provides means that if I'm not around to answer a question, a quick phone call to Apple or a visit to the local store will usually result in rapid resolution. Keeps them off my back.

But for the serious gamer? Well, as Apollo 13 hero Jim Lovell said, "I believe we've had a problem here."

There are a few things standing in the way of the Mac-owning gamer who wants state-of-the-art hardware for maximum performance with modern games.

Firstly, excepting the deskside Mac Pro models, there is no real means to upgrade the anemic graphics hardware in the Apple machines. Some of the higher-end MacBook models are capable of running games with acceptable frame rates, but the really sophisticated, bleeding-edge titles are off-limits if acceptable performance is expected.

Even with the Mac Pro, graphics hardware options are severely limited if the user wants to retain the full range of Apple-specific functionality and interface sensibilities from the time the machine is powered up: the cards used by Apple require proprietary firmware (for EFI and the OS), meaning non-certified cards will not function properly in OSX, nor will proper boot screen functionality be retained.

This means the user is limited to Apple-specific cards if they wish to retain these capabilities, and these cards tend to lag severely behind the current state of the art in performance. By way of example, the fastest card listed on the Apple Mac Pro web page at the time of writing is the ATI Radeon HD 4870, a card released two years ago. While there are some third-party cards of higher specification available, these too are at least a generation behind the state of the art. And of course, either solution carries the burden of the "Apple tax": you will pay more for the same card compared to the PC version.

It is possible to perform what is effectively "brain surgery" on more modern cards via firmware manipulation to enable their use in a Mac Pro, but the likelihood of reduced functionality and performance, or of producing an expensive paperweight by such antics, far outweighs the benefits. See the entries at everymac.com and the Expansion Cards and Hardware Compatibility sections of the Wikipedia entry for the Mac Pro for a glimpse into the grief faced by users who need more GPU horsepower in the Mac environment.

Yet even then, the Mac user is boxed in: the latest high-performance GPUs are quite power hungry. A single card may tax the power supply of the Mac Pro; dual cards (SLI or CrossFire) would be out of the question without cobbling together some sort of Frankenstein power supply to supplement or supplant the one that comes in the machine.

Secondly, the Mac gamer is faced with a reality that mirrors the hackers' disinterest in the Mac: by and large, game developers don't give a hoot about the Apple environment. The Apple store lists 70 titles. Total. A tiny handful of FPS games.

This means that the Mac owner, if they want to play most any current game, will need to do so using Microsoft Windows. No need to rush out and buy a PC to supplant your Mac because it can't do something you want, however. There are a few ways a Mac owner can play Windows games utilizing their Mac hardware. We'll outline these here.

For simple games (2D, scrollers, etc.) with lightweight graphics performance requirements, a virtual machine environment such as Parallels Desktop or VMware Fusion will allow the user to install Microsoft Windows and the games they desire into a virtual machine. This effectively creates a "computer in the computer", and for simple games will allow the user to play without leaving the OSX environment. My own experiments show reasonable performance on a higher-end Mac Pro, so long as the game's graphical requirements are kept modest. For games with more rigorous requirements, performance in a virtual environment severely lags behind that of running on native hardware.

For these kinds of games, the user will need to install Windows in a fashion that allows for native booting. This can be accomplished with Apple's own Boot Camp or in a more flexible but more involved implementation using rEFIt.

Boot Camp provides a trivially simple mechanism for the gamer to get up and running with Windows games on their Mac hardware. The Boot Camp Assistant installer walks the user through automatic repartitioning of the disk and installation of Windows. The current OSX install discs contain the Windows drivers for the user's Mac hardware, simplifying the installation and configuration of Windows considerably: no need to ferret these out from the web. The details and guides for using Boot Camp can be found at the Apple support page for the product.

rEFIt is an open-source EFI boot loader and toolkit for the Mac. Unlike Boot Camp, which is limited to one alternate installation of Windows alongside OSX, rEFIt gives the user the functionality of a traditional boot manager, allowing the installation of multiple operating systems. On power-up, the user is presented with a boot menu showing which operating systems are available for selection.

rEFIt Boot Screen

I use rEFIt to run Windows, Linux, and OSX. The installation of rEFIt does not have the typical Apple hand-holding: it's not an Apple product, after all. That said, the installation and configuration are fairly trivial, and any gamer who has built their own machine should have no trouble getting up and running under rEFIt. The canonical source for installation and configuration is the SourceForge page for rEFIt.

For either the Boot Camp or rEFIt solutions, I would recommend the gamer determine the precise hardware configuration of their Mac and acquire the latest Windows hardware drivers from the appropriate sources before starting the OS installation process. Often only the most current drivers will provide the desired gaming experience for the newest games (graphics card drivers being particularly fickle). At the very least, ensure that you have the needed network (NIC) driver, so that once Windows is installed and booted, you can retrieve other needed drivers from the Internet.

You'll also want to get your hands on a decent keyboard and mouse. While Apple's own offerings exude a Bang & Olufsen elegance, they're utterly useless for any kind of real gaming.

See you on the battlefields!