"It was almost frightening, the reaction... from teenage boys"
"a threat to the moral well being"
"fosters almost totally negative and destructive reactions in young people"
"deplorable"
"definite danger to the security of the United States"
Reactions to video games considered violent? No, these are quotes from things said about Elvis Presley and "rock and roll" music only a few decades ago. How quaint, and how ridiculous, the hype and overreaction seem to us now.
The same kinds of overreaction and hype have been seen before from society at large and from "experts": television, violent movies, comic books, and "professional" wrestling, to name a few.
The current poster child for the extremists seems to be "violent" video games, that is, video games that graphically portray violence, gore, criminal behavior, or other material considered by some to be "provocative" or otherwise objectionable. Opinion pieces and studies have attempted to link such games to actual aggression and addiction in the players.
In this blog entry, I'm going to argue that these opinions and "studies" do not hold water.
According to statistics reported by the Entertainment Software Association, 68% of US households play video games, but only about a quarter of video game players are under the age of 18. The majority of players are adults, who in fact represent the fastest growing demographic of players, at around 35% of U.S. households. The average age of players is 35, and the most frequent game purchaser is 39 years old. An amazing fact to this author is that by 2009, 25% of video game players were over 50 years old (we might have to add a key for "walker" in addition to the sprint key...). Hardly children, I think you'll agree.
In addition, 92% of players under 18 (children) report that their parents are present when video games are purchased or rented, and nearly two-thirds of parents believe video games are a positive part of their children's lives.
And yet, studies such as "Effects of violent video games on aggressive behavior, aggressive cognition, aggressive affect, physiological arousal, and prosocial behavior: A meta-analytic review of the scientific literature" by Anderson, "Aggression and psychopathology in adolescents with a preference for violent electronic games" by Funk et al., and "Violent video games: The newest media violence hazard" by Gentile all claim to show links between the playing of violent video games and actual aggressive and violent behavior by the players.
Is there such a link? As in any investigative science, there will be differing views. I believe a careful examination of reports such as those listed, and a comparison with those holding opposing views, must lead one to conclude that reports linking video game violence to actual violence suffer from poor methodologies, ignored negative results, failure to cite academic research with differing views, and improper use of meta-analytical techniques. Craig A. Anderson, author of one of the more comprehensive papers supporting the hypothesis, and a witness before the U.S. Senate on the issue, seems to be a particularly egregious offender against scientific accuracy and journalistic integrity.
Major studies by groups such as The Harvard Medical School Center for Mental Health, The Journal of Adolescent Health, and The British Medical Journal have shown no significant link between video games and actual player violence. As these studies make clear, the old adage "correlation does not imply causation" holds true here, as it should in any honest scientific study. It appears to this author that most if not all of the authors of reports supporting the hypothesis have fallen into the logical fallacy of post hoc ergo propter hoc: "after this, therefore because of this". The mistake lies in reaching a conclusion, as these authors have, without considering confounding factors that could rule out any actual connection or causality. Correlations prove very little, if anything, other than that a correlation exists. Is it more likely that a violent game begets violence, or that violent children prefer violent games?
Constructing a scientifically valid test of the hypothesis that violent video game play begets actual violence may be nearly impossible: the very definition of "aggression" is difficult to measure objectively. Some studies done with college students, which attempt to measure aggression objectively by allowing players to blast their "opponents" with noise bursts as "punishment", are flawed for the very reason that the participants view the exercise as a game. There is no real "punishment"; the noise bursts cannot cause any real harm, and there is little if any remorse in punishing your opponent if they punish you.
In addition, an accurate definition of "violence" in video games themselves is hard to come by. Is the Madden football series really violent, as some studies state, considering the societal acceptance of the real game of football, where nearly 23 deaths per year are reported on average? Is the children's game Kirby violent? After all, the protagonist swallows his enemies whole, absorbing all of their powers.
I agree wholeheartedly with Nathaniel Edwards in his blogcritics posting: "No matter what a study's results show, the media can be counted on to warp it enough to make it interesting. Typically, this means that headlines claim a greater link between violent media and aggression. There are few details in the actual news stories, and instead there are lots of sweeping claims which don't allow the reader to interpret anything."
There can be no doubt that tragedies such as The Columbine High School massacre and The Virginia Tech massacre are calamities beyond the imaginations of most of us. Expressions of condolences to the victims and their families and friends seem hollow, so horrible were the events. Nonetheless, the immediate links to violent video games made by the media and "experts" were scientifically unfounded.
A particularly bright light of reason and scientific integrity is Christopher J. Ferguson of the Behavioral, Applied Sciences and Criminal Justice department at Texas A&M International University. Professor Ferguson has done extensive research on violent behavior and has published landmark papers specifically examining the question from a video game standpoint, along with various lay articles on the subject.
In his paper titled "The School Shooting/Violent Video Game Link: Causal Relationship or Moral Panic?", Ferguson states "Some scholars have attempted to draw links between laboratory and correlational research on video game playing and school shooting incidents. This paper argues that such claims are faulty and fail to acknowledge the significant methodological and constructional divides between existing video game research and acts of serious aggression and violence. It is concluded that no significant relationship between violent video game exposure and school shooting incidents has been demonstrated in the existing scientific literature, and that data from real world violence call such a link into question."
In the paper, Ferguson shows how the conclusions reached by the research he studied, research supporting the hypothesis that violent video gaming causes real violence, are faulty. A most interesting fact revealed in the paper is that while video game play has grown explosively among children and adults, violent crime over the same period has decreased significantly, in both police arrest data and crime victimization data in the U.S., with similar results found in studies from Canada, Australia, the European Union, and the United Kingdom. The interested reader is referred to the linked paper for the details of Ferguson's study.
A more approachable article for the lay reader was published by Ferguson in the September/October 2009 issue of Skeptical Inquirer, titled "Violent Video Games: Dogma, Fear, and Pseudoscience". In this article, Ferguson reviews and distills research on both sides of the argument, reaching the same conclusion as in his academic research and publications: there is no proven link between the playing of violent video games and actual violence by the players of such games, and the current research and studies claiming otherwise suffer from severe methodological and other issues that compromise their scientific integrity and usefulness.
In particular, Ferguson shows how these studies typically suffer from severe "citation bias": a failure to honestly report on research that does not support the author's hypothesis. Even more concerning are papers, such as the aforementioned Anderson study, where the authors appear to ignore their own results in order to advance their a priori hypothesis. Anderson used four measures of aggression in his laboratory studies and found only a very weak significance for one of them. Had proper statistical techniques been employed, even this weak link would have been shown to be statistically insignificant. Nonetheless, the author chose to ignore the results that did not support his hypothesis and published a paper based on the single, intrinsically flawed, result set.
Ferguson sums up this kind of behavior, unfortunately typical of the side supporting the hypothesis: "I believe that these authors have ceased functioning as scientists and have begun functioning as activists, becoming their own pressure group. Indeed, in at least one article, the authors appear to actively advocate censorship of academic critics and to encourage psychologists to make more extreme statements in the popular press about violent video-game effects."
With respect to the links made between the tragic school shootings and the fact that the perpetrators played violent video games, Ferguson retorts "It is certainly true that most (although not all) school shooters played violent video games. So do most other boys and young men. Concluding that a school shooter likely played violent video games may seem prescient, but it is not. It is about as predictive as suggesting that they have probably worn sneakers at some time in the past, are able to grow facial hair, have testicles, or anything else that is fairly ubiquitous among males."
That legal nut cases such as Jack Thompson, a particularly troublesome and misinformed activist, have been disbarred for their antics gives a glimmer of hope that sanity will prevail. Just as the "Elvis is the Devil" chants that were once the norm now seem ludicrous, it appears the media and science are coming to their senses with respect to video games and violence. A recent paper in the medical journal The Lancet suggests "the time may have come to move beyond professional research focusing on media violence, generally, as a cause of serious aggression."
U.S. courts have blocked laws that attempt to outlaw violent video games, and most recently the U.S. Supreme Court has agreed to review a California law that attempted to restrict the sale of video games.
We may finally get the legal protection we deserve to purchase and play the games we choose. And as more researchers like Professor Ferguson present facts backed by proper research and scientific integrity, the spectre the media has portrayed linking violent real-life behavior and video games will fade into oblivion.
Monday, May 17, 2010
Doing the Jitterbug: Lag, Latency, Jitter and Throughput and How They Affect Online Gaming (Part I)
Consistency: In an ideal online gaming world, every player in the game would see the same consistent game world, and every client of a game server would have access to exactly the same game state information at any given instant. No player would suffer from seeing actions in the game world later than other players, or lose a shooting duel because the opponent has a faster, lower latency connection to the game server. In the real world, this is of course impossible to achieve with existing technology.
Perhaps the greatest impediment to reaching this goal is caused by the vagaries of the underlying infrastructure that provides the communication between the game clients and servers: the Internet. The path to the goal of perfection is washed away by two primary effects in game communication over the Internet:
- Throughput limitations - the connection between server and client has some maximum throughput.
- Network delays - even with unlimited throughput, server messages do not arrive instantly.
The second, network delays, means that regardless of the available bandwidth, there is always a delay between the server sending a message and the client receiving it, and vice versa. In effect, the information in the message is time-shifted to a later wall-clock time: an event may happen at actual time X, but by the time the client receives the message of the event, the actual time is perhaps X+2. When time-shifting is present, the possibility of causality violations rears its head: events may seem to occur in the wrong order to game clients. For example, a player may open a door, but that door isn't there in the game world because it was destroyed by a grenade (and seen as such by another online player), an event the client component of the first player was not yet aware of due to delays in message arrival. Other players might see the first player's avatar "opening" a door that isn't even there.
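We can make the door example concrete with a few lines of Python. This is a toy illustration, not code from any real game engine: the event names, timestamps, and per-client delays are all invented for the sketch.

```python
# Two events sent at nearly the same server time reach two clients with
# different network delays, so the clients perceive them in different orders.
events = [
    ("grenade_destroys_door", 100),  # server timestamp in ms (assumed)
    ("player_opens_door", 101),
]

def arrival_order(delays):
    """Order events by arrival time at a client, given per-event delays (ms)."""
    arrivals = [(ts + delays[name], name) for name, ts in events]
    return [name for _, name in sorted(arrivals)]

# Client A sees uniform delay, so it observes the true order of events.
client_a = arrival_order({"grenade_destroys_door": 50, "player_opens_door": 50})
# Client B's grenade message is delayed: the door "opens" before it is destroyed.
client_b = arrival_order({"grenade_destroys_door": 120, "player_opens_door": 50})
print(client_a)  # ['grenade_destroys_door', 'player_opens_door']
print(client_b)  # ['player_opens_door', 'grenade_destroys_door']
```

Client B has experienced exactly the causality violation described above, even though the server emitted the events in the correct order.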
We will review the details of these problems, the effect they have for players of online games, and the techniques commonly used to minimize the unfairness between players that can result if these problems are left untreated. We will cover the definitions and details of the problems in this blog entry, with part II to cover the effects these have on players and the techniques used in games to minimize their impact.
Throughput and Latency:
The throughput requirements for online PC games vary widely, but in general are far below the available bandwidth of the typical client (we will only discuss the client side and its impact; obviously, running a game server dictates a much higher aggregate bandwidth requirement). Recent studies (Feng 2002 & 2005) using games such as Counter-Strike, Day of Defeat, Medal of Honor: Allied Assault, and Unreal Tournament 2003 showed a client load ranging from 6,400 bps to 8,000 bps for client-to-server packets and 20,800 bps to 36,000 bps for server-to-client communications. These are far below even the lower-tiered ISP services typically used by online gamers.
Congestion of the network may cause throughput to drop to a level that is insufficient for smooth game play. Congestion typically occurs in one of three primary areas: the last mile near the user, the middle or Internet cloud, and the last mile on the server side.
In the case of user-side congestion, the player may simply have a service tier that does not provide sufficient bandwidth. This can of course be remedied with a service upgrade. At a minimum, a service with 600 kbps down and 50 kbps up should suffice. The faster downlink speed, while not strictly required to play online, will ensure faster downloads of game items such as server-hosted maps.
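A quick back-of-envelope check shows how much headroom that minimum tier leaves over the worst-case Feng figures quoted earlier:

```python
# Worst-case client loads from the cited Feng studies.
client_to_server_bps = 8_000     # client -> server
server_to_client_bps = 36_000    # server -> client

# The suggested minimum service tier: 600 kbps down, 50 kbps up.
tier_down_bps = 600 * 1000
tier_up_bps = 50 * 1000

down_headroom = tier_down_bps / server_to_client_bps
up_headroom = tier_up_bps / client_to_server_bps
print(f"{down_headroom:.1f}x down, {up_headroom:.2f}x up")  # 16.7x down, 6.25x up
```

Even this modest tier carries the game traffic many times over; it is everything else sharing the connection, discussed next, that eats the margin.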
The gamer should also ensure that other activities on their local network are not causing congestion. Other users gaming, streaming audio or video, using services such as torrents, etc. can all adversely affect the overall available broadband bandwidth for the player.
Problems in the last mile on the server side can be caused by too many players joining a specific game server, causing a bottleneck on the network link to that server. Game servers typically employ a player count limit to avoid this occurrence. Any other congestion in this link of the game communication network (router congestion or failure modes, etc.) is likely to be out of the control of both the player and their server provider.
Congestion in the Internet cloud is usually transient: perhaps a widely viewed sporting event is being streamed by large numbers of viewers. As with most last mile issues on the server side, these are out of the control of the server provider and the game player. In cases where Internet cloud congestion is the cause of game play issues, the only remedy is to wait until the problem "goes away".
Any kind of congestion, whatever the cause, can degrade throughput enough to adversely affect the consistency of game play. If the game client is starved of message packets, whether from raw throughput limits or congestion-related throughput issues, the synchronization between client and server will be lost, resulting in "laggy" game play, "rubber-banding", and other temporal effects. Severe throughput problems can result in the game client "giving up" and disconnecting from the game server.
There is no commonly agreed-upon definition of latency (Delaney 2006). The latency of a network is commonly measured using the ping command. This, however, measures not the one-way trip from client to server or vice versa, but the round-trip time. Since the routes from client to server and server to client are usually asymmetric, simply taking half the value of a ping measurement may be grossly inaccurate and provide incorrect information for making client and server timing decisions. In addition, such a measurement does not account for processing and other delays at the client and server endpoints.
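To see how badly the halved-ping estimate can miss, consider a hypothetical asymmetric route. The 20 ms and 60 ms one-way figures below are assumptions for illustration only:

```python
# Hypothetical asymmetric route: ping reports only the round trip, which
# hides the asymmetry between the two directions.
forward_ms = 20   # client -> server (assumed)
reverse_ms = 60   # server -> client (assumed, e.g. a congested return path)

rtt_ms = forward_ms + reverse_ms
naive_one_way_ms = rtt_ms / 2
print(naive_one_way_ms)               # 40.0
print(naive_one_way_ms - forward_ms)  # 20.0 ms too high for one direction
print(reverse_ms - naive_one_way_ms)  # 20.0 ms too low for the other
```

A timing decision built on the naive 40 ms figure would be off by a factor of two in one direction, which is exactly why the endpoint-to-endpoint measurement below is preferred.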
A more useful measurement is the endpoint-to-endpoint measurement of latency that accounts for time needed for client-side processing, bi-directional network delay, and server-side processing (Stead 2008).
This is important: studies have found that much of the delay in the overall game processing loop is caused by the game client's handling and processing of messages.
The sources of network delay fall into four basic categories (Kurose 2009):
- Transmission delay: Packet time to physical layer.
- Queuing delay: Packet time waiting to be sent to a link.
- Processing delay: Packet time spent at routers along the route.
- Propagation delay: Packet time in physical link (bounded by the speed of light).
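The per-hop (nodal) delay is simply the sum of these four components. The sketch below computes it for one hop; the link rate, distance, and the processing and queuing figures are illustrative assumptions, not measurements:

```python
# Per-hop delay as the sum of the four components listed above.
packet_bits = 12_000         # a 1500-byte packet
link_rate_bps = 10_000_000   # assumed 10 Mbps link
hop_km = 100                 # assumed hop length
signal_km_per_s = 2.0e5      # ~2x10^8 m/s signal speed in fiber

transmission_ms = packet_bits / link_rate_bps * 1000  # time to put bits on the wire: 1.2 ms
propagation_ms = hop_km / signal_km_per_s * 1000      # time in the physical link: 0.5 ms
processing_ms = 0.05                                   # assumed router processing
queuing_ms = 0.3                                       # assumed; varies with load

total_ms = transmission_ms + propagation_ms + processing_ms + queuing_ms
print(f"{total_ms:.2f} ms per hop")  # 2.05 ms per hop
```

Note that queuing delay is the only strongly load-dependent term, which is why congestion shows up as jitter rather than as a fixed offset.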
Queuing delay can occur at routers along the path of the packets. If a router is under heavy utilization or the required outbound link is busy, the packet will be queued in a buffer until it can be sent.
Processing delay is also incurred at routers, since these must handle routing table lookups, possible firewall rule application, and packet checksum and error checking.
Lastly, even if delays in packet transmission due to processing overhead, transmission delays, and queuing delays could be eliminated, we are still bound by the laws of physics. No signal can travel faster than light (2.998x10^8 m/s in vacuo). Speeds in actual transmission media are lower (e.g. 1.949x10^8 m/s in typical optical fiber, significantly lower for twisted-pair copper). This means we are bound by an absolute minimum round-trip latency of roughly 2 ms from client endpoint to server endpoint and back for a client-to-server distance of 200 km.
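The quoted bound is easy to verify from the fiber speed alone:

```python
# Minimum round-trip time through typical optical fiber for a
# client-to-server distance of 200 km, ignoring all other delay sources.
v_fiber_m_per_s = 1.949e8   # signal speed in typical optical fiber
distance_m = 200_000        # 200 km each way

round_trip_ms = 2 * distance_m / v_fiber_m_per_s * 1000
print(f"{round_trip_ms:.2f} ms")  # ~2.05 ms
```

No amount of hardware or protocol cleverness can get under this floor; compensation techniques (covered in part II) can only hide it.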
Jitter:
Jitter is the variation in network latency caused by changes in the state of the network. Packets that comprise the communication between the game client and server seldom follow the exact same route endpoint to endpoint. This can cause packets to have different latencies. In addition, network congestion can result in changes in the routing and router buffering behavior, changing the queuing delays for the affected routers.
We can visualize this effect with the aid of a diagram.
In this diagram, packets are sent from the server represented by the lower solid line at regular intervals (time ticks) to the client represented by the upper solid line. If we were able to construct a network with none of the four causes of latency outlined, and in addition discovered a way to violate the laws of physics and send our packets with infinite speed, the green line results: there is no latency between the server sending a packet and the client receiving it.
The more realistic example is represented by the blue line, which shows the slight delay the packet experiences traversing the network from the server to the client. The orange line depicts the next packet in the sequence, which is delayed by the same amount as the packet of the blue line. In the ideal world, the latency from the server to client and vice versa would exhibit this constancy. This would simplify any "compensation" for latency the game developers might wish to utilize, and even without compensation, humans tend to have an easier time adapting to latency in a game when it is relatively constant, even when the latency is rather large (Claypool 2006).
More typically, the game packets experience changes in latency due to routing and congestion problems. This is illustrated by the final train of three packets colored red, magenta, and dark brick red. For these packets, it is clear that any semblance of packet arrival at relatively regular time ticks is completely lost. There is currently no standard measure for jitter in game traffic. Jitter in networks tends to exhibit randomness, but can be characterized by a Gaussian distribution of inter-packet arrival times (Perkins 2003). Since we are bounded by some minimal amount of processing, queuing, and transmission delay, in addition to the absolute bound due to propagation delay, the actual distribution is biased: there is some absolute minimum that can be realized, and network congestion and related issues can skew delays beyond it. This is illustrated in the following graph.
Graph of Gaussian (Red) and skewed/biased distributions (Blue) for inter-packet arrival times.
The fit is sufficient that we can use this model to predict the likelihood of specific inter-packet times when designing compensatory mechanisms for games.
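A minimal sketch of that biased model: Gaussian jitter around the nominal server tick, clipped at an absolute minimum gap representing the propagation and processing floor. The tick, spread, and floor values are assumptions for illustration, and the hard clip is a deliberate simplification of the skewed distribution in the graph:

```python
# Simulate inter-packet arrival times: Gaussian jitter with a hard floor.
import random

random.seed(42)

TICK_MS = 50        # nominal server send interval (assumed)
JITTER_SD_MS = 8    # assumed jitter spread
FLOOR_MS = 45       # assumed minimum achievable inter-arrival gap

def inter_arrival_ms():
    return max(FLOOR_MS, random.gauss(TICK_MS, JITTER_SD_MS))

samples = [inter_arrival_ms() for _ in range(10_000)]
print(min(samples) >= FLOOR_MS)               # True: the floor is never crossed
print(sum(samples) / len(samples) > TICK_MS)  # True: clipping biases the mean upward
```

The upward-biased mean is the skew the graph depicts: delays can stretch arbitrarily far to the right, but can never beat the physical minimum on the left.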
In part II of Doing the Jitterbug, we will investigate what effects these issues have on game play, and what techniques can be used to minimize these effects.
Interested readers can find references for further study after the jump break.
Sunday, May 16, 2010
See Your Way Out - Using mathematics and image processing to quickly pathfind
As I outlined in my blog entry Earning Your Pathfinding Merit Badge: How Game Characters Navigate Their Game World, pathfinding (or path finding) is a critical component in PC games.
Pathfinding refers to the techniques used in games to allow Non-Player Characters (NPC) to navigate the game world.
Most games use some variant of Dijkstra's algorithm, a technique pioneered by Dutch computer scientist Edsger Dijkstra. Dijkstra's algorithm solves the shortest path problem for a graph with non-negative edge lengths (the distances between the vertices of the graph are never negative), producing a shortest path tree as its result. It is one of a myriad of graph search (or graph traversal) algorithms.
Dijkstra's algorithm is often used in routing problems, such as network routing protocols, most notably IS-IS and OSPF (Open Shortest Path First), and transportation map shortest path solutions.
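For readers who like to see the algorithm rather than read about it, here is a minimal Python sketch of Dijkstra's algorithm (a generic textbook version, not code from any particular game or router; the toy graph and its edge lengths are made up for illustration):

```python
import heapq

def dijkstra(graph, start):
    """Dijkstra's algorithm over an adjacency dict {node: [(neighbor, length), ...]}
    with non-negative edge lengths. Returns the shortest distance from start
    to every reachable node (the distances of the shortest path tree)."""
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float('inf')):
            continue  # stale queue entry; a shorter path was already found
        for neighbor, length in graph.get(node, []):
            nd = d + length
            if nd < dist.get(neighbor, float('inf')):
                dist[neighbor] = nd
                heapq.heappush(heap, (nd, neighbor))
    return dist

# Toy map: four waypoints with illustrative edge lengths.
graph = {
    'A': [('B', 2), ('C', 5)],
    'B': [('C', 1), ('D', 4)],
    'C': [('D', 1)],
    'D': [],
}
print(dijkstra(graph, 'A'))  # {'A': 0, 'B': 2, 'C': 3, 'D': 4}
```

Note that the direct edge A→C of length 5 loses to the two-hop route through B of length 3; the algorithm settles each node only when its shortest distance is known.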
Perhaps the most commonly used variant in games is the A* search algorithm. Instead of using the distance from the start node (vertex) of the graph, this variant chooses nodes based on an estimate of the distance from the start node to the destination node. This estimate is formed by adding the known distance to the start node to a guess of the distance to the destination node. This guess is called the heuristic of the algorithm.
By utilizing such a technique, the A* algorithm provides improved performance when compared with Dijkstra's algorithm. If the guess is set to zero, the two algorithms are equivalent. When the guess is positive but less than the true distance to the goal, A* continues to find optimal paths, but because fewer nodes are examined, performance is improved. When the guess exactly matches the actual distance, A* finds the optimal path while examining the minimum possible number of nodes. If the guess is increased yet further, A* continues to find paths, examining fewer nodes still, but no longer guaranteeing an optimal path. Since it is not practical in most games to guarantee a correct heuristic at all times, this trade-off is acceptable given the gain in performance.
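The behavior just described can be sketched in Python (a generic grid-based A*, not code from any particular engine). On a 4-connected grid, the Manhattan distance never overestimates the true distance, so it is an admissible heuristic and the returned path is optimal; passing a zero heuristic makes the search behave exactly like Dijkstra's algorithm, as noted above.

```python
import heapq

def astar(grid, start, goal, heuristic):
    """A* over a 4-connected grid of 0 (open) / 1 (wall) cells.
    The priority is g (known cost from start) + heuristic (guess to goal)."""
    rows, cols = len(grid), len(grid[0])
    g = {start: 0}
    heap = [(heuristic(start, goal), start)]
    came = {}
    while heap:
        _, node = heapq.heappop(heap)
        if node == goal:
            path = [node]
            while node in came:      # walk parents back to the start
                node = came[node]
                path.append(node)
            return path[::-1]
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g[node] + 1
                if ng < g.get((nr, nc), float('inf')):
                    g[(nr, nc)] = ng
                    came[(nr, nc)] = node
                    heapq.heappush(heap, (ng + heuristic((nr, nc), goal), (nr, nc)))
    return None  # no path exists

manhattan = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1])

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = astar(grid, (0, 0), (2, 0), manhattan)
print(len(path) - 1)  # 6 moves: around the wall
```

Swapping in `lambda a, b: 0` for the heuristic yields the same optimal path length, just with more nodes examined, which is precisely the equivalence with Dijkstra's algorithm described above.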
With extremely complex environments (like the maze we'll be looking at), these algorithms can prove to be impractical, with other solutions preferred. A nice overview of such issues can be seen at Sean Barrett's Path Finding blog entry.
I'm going to go through a technique that is conceptually beautiful to me: Using image processing to solve a maze path finding problem.
I'm sure every reader is familiar with mazes. Perhaps some of you have found yourself lost in a particularly fiendish one, having to ring the nearby "rescue bell"!
I'll be using the superb Matlab system from Mathworks.
From their web site: "MATLAB® is a high-level language and interactive environment that enables you to perform computationally intensive tasks faster than with traditional programming languages such as C, C++, and Fortran."
It most certainly is! Along with the equally superb Mathematica it is the staple of my tool set for doing complex mathematical programming, analysis, graphing, and data manipulation. These two products would be my desert island choices to take with me if I could only have two programming environments.
I'll be using the Matlab image processing package DIPimage, developed by Cris Luengo, Professor in multidimensional image analysis at the Swedish University of Agricultural Sciences and Uppsala University. I've borrowed copiously from his examples. Users of Matlab can obtain a license for DIPimage free of charge (normally a $3200.00 and up cost) for academic, non-commercial use. I highly recommend it!
Let's begin.
We will start with the following simple (no holes, loops, etc.) maze:
Note the entry and exit apertures in the lower right and upper right areas of the outer wall of the maze. Try solving it manually if you'd like!
Next, we use our image processing functions to convert our maze to grey-valued, and threshold it to obtain the binary image. The maze walls are now shown in red, with the path areas in black:
maze = colorspace(inputmaze,'grey');
maze = ~threshold(maze)
We use image processing functions to "label" any found walls and show them with differing colors. In this case, we clearly see the maze comprises two walls:
maze = label(maze);
dipshow(maze*2,'labels')
From this we can easily see the path is the area between the two maze walls. By applying a little mathematical morphology, we can isolate one of the walls:
path = maze==1
We then can do a binary dilation on the wall and fill in any holes (effectively a smearing of the wall):
pathwidth = 5;
path = bdilation(path,pathwidth);
path = fillholes(path)
Next, let us erode this smeared version of the wall. Erosion effectively "shrinks" the image, removing components smaller than the erosion radius; for a binary image such as ours, it strips away the perimeter pixels. We then take the difference between the smeared image and its eroded version:
path = path - berosion(path,pathwidth);
Finally, overlaying the result with our original maze shows our desired goal:
overlay(inputmaze,path)
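For readers without Matlab, the core morphological trick translates to pure Python. The sketch below is a toy, not DIPimage code: it implements binary dilation and erosion with a square structuring element and shows that dilating a wall, then subtracting the erosion of that dilation, leaves a band hugging the wall on both sides, which is exactly where the corridor of a simply connected maze lies. The hole-filling step is omitted here since this toy wall encloses nothing.

```python
def dilate(cells, radius, shape):
    """Binary dilation with a square structuring element: every cell within
    Chebyshev distance `radius` of a set cell becomes set."""
    rows, cols = shape
    out = set()
    for r, c in cells:
        for dr in range(-radius, radius + 1):
            for dc in range(-radius, radius + 1):
                if 0 <= r + dr < rows and 0 <= c + dc < cols:
                    out.add((r + dr, c + dc))
    return out

def erode(cells, radius, shape):
    """Binary erosion: a cell survives only if its whole square neighborhood
    (clipped to the image bounds) is set -- the dual of dilation."""
    rows, cols = shape
    out = set()
    for r, c in cells:
        if all((r + dr, c + dc) in cells
               for dr in range(-radius, radius + 1)
               for dc in range(-radius, radius + 1)
               if 0 <= r + dr < rows and 0 <= c + dc < cols):
            out.add((r, c))
    return out

# One straight "wall" across a 7x7 image; the band dilate(wall) - erode(dilate(wall))
# hugs the wall on both sides, just as the highlighted path hugs the maze wall.
shape = (7, 7)
wall = {(3, c) for c in range(7)}
smeared = dilate(wall, 1, shape)
band = smeared - erode(smeared, 1, shape)
print(sorted(band)[:3])  # [(2, 0), (2, 1), (2, 2)]
```

The real maze solution works the same way, with `fillholes` ensuring the smeared wall is a solid blob before the erosion is subtracted.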
I think you'll agree, this is a most fascinating use of tools from a domain, image processing, that you'd not likely think applicable to this problem.
The interested reader can find details of techniques such as these in these excellent references:
The Image Processing Handbook by John C. Russ
Digital Image Processing Using MATLAB by Rafael C. Gonzalez
Feature Extraction & Image Processing by Mark Nixon
Happy trails to you!
Thursday, May 13, 2010
I've Got A Stalker: S.T.A.L.K.E.R. Shadow of Chernobyl, that is.
S.T.A.L.K.E.R.: Scavenger, Trespasser, Adventurer, Loner, Killer, Explorer, Robber.
I was blown away by the description and the screen shots shown, the graphics looked amazing compared to any shooter we'd played up to that time. The rumored release of the game was soon, so the juices started flowing for what looked to be a most excellent addition to the game collection.
Unfortunately, the game faced delay after delay, eventually resulting in a ninth place finish in Wired's Vaporware '06 contest. In January 2007, a contest for players to experience the game beta in a marathon session collapsed when the THQ staff (publishers of the game) that had organized the event were themselves unable to obtain copies of the game.
The game finally made its public debut at the end of March, 2007.
By fulfilling mission requests provided by all sorts of NPCs in the game and by obtaining or otherwise finding valuables, the player builds a stock of goods that can be sold or traded for items such as food, weaponry, clothing, ammunition, and artifacts.
The artifacts in the game play a key role in trading and player protection:

"Most anomalies produce visible air or light distortions and their extent can be determined by throwing bolts (of which the player carries an infinite supply) to trigger them. Some stalkers also possess an anomaly detector, which emits warning beeps of a varying frequency depending on their proximity to an anomaly. The guide in the film Stalker, and his predecessors in the Strugatsky brothers' book Roadside Picnic, test various routes before proceeding. In the film, metal nuts tied with strips of cloth are used.
Anomalies produce Artifacts, the valuable scientific curiosities that make the Zone worth exploring monetarily. As well as being traded for money, a number of Artifacts can be worn so that they provide certain benefits and detriments (for example, increasing a stalker's resistance to gunfire while also contaminating him with small amounts of radiation). Artifacts are found scattered throughout the Zone, often near clusters of anomalies."
At first, I found this aspect of the game boring and time consuming. Little did I know I would soon become addicted to the hunt, particularly for the more rare items. I soon came to appreciate this type of game play, common in the MMORPG games such as World of Warcraft that I'd poked fun at.
Game play covers a huge area of many square miles. Originally, the game was to be completely open, but by release, the map had been subdivided into eighteen areas, each reachable through specific passages. Nonetheless, the game play always feels expansive, with superb draw distances.
This, combined with the wide range of choices in interactions and missions (Who do I want to befriend? Who do I decide to trade with? What items do I want for my character?) leads to excellent replay value in the game.
The game is based in the in-house developed X-Ray graphics rendering engine. From the Wikipedia entry:
"The X-ray Engine is a DirectX 8.1/9 Shader Model 3.0 graphics engine. Up to a million polygons can be on-screen at any one time. The engine features HDR rendering, parallax and normal mapping, soft shadows, motion blur, widescreen support, weather effects and day/night cycles. As with other engines that use deferred shading, the X-ray Engine does not support anti-aliasing with dynamic lighting enabled. However, a "fake" form of anti-aliasing can be enabled with the static lighting option; this format utilizes a technique to blur the image to give the false impression of anti-aliasing."
Even in 2010, the graphics hold up well, especially on high-end machines.
The A.I. system was also built in-house. The "ALife" system originally was to have NPC constantly active in the game world, regardless of player interaction. By release, this had been reduced in functionality. Nonetheless, the capabilities are quite robust, as described in the Wikipedia entry:
"GSC Game World's proprietary ALife artificial intelligence engine. ALife supports more than one thousand characters inhabiting the Zone. These characters are non-scripted, meaning that AI life can be developed even when not in contact with the player.
The NPCs have a full life cycle (task accomplishment, combat, rest, feeding and sleep) and the same applies to the many monsters living in the Zone (hunting, attacking stalkers and other monsters, resting, eating, sleeping). These monsters migrate in large groups. The non-scripted nature of the characters means that there are an unlimited number of random quests. For instance, rescuing stalkers from danger, destroying stalker renegades, protecting or attacking stalker camps or searching for treasure. The AI characters travel around the entire zone as they see fit.
Numerous tactics can be employed to complete the game, such as rushing or using stealth and sniping. The NPCs will react in a different way to each of them. S.T.A.L.K.E.R.'s NPCs plan ahead by "Goal-Oriented Action Planning" to achieve this."
S.T.A.L.K.E.R. uses a modified version of the ODE physics engine to provide rag doll physics and accurate bullet ballistics: Bullets are affected by gravity and ricochet off of surfaces.
Adding to the realism is a completely dynamic day / night progression of time, including weather effects such as rain, lightning, showers and sunlight.
Even at an age nearing four years, it is still a most worthwhile game, both for FPS fans and RPG players. That it is now available on Valve's excellent Steam system for under $20.00 makes this a no-brainer if you have not already played it. Since that version carries the latest patch, most of the niggling bugs in the initial release have been remedied.
The follow-up games from the same developer, S.T.A.L.K.E.R.: Clear Sky released in September 2008 for North America, and S.T.A.L.K.E.R.: Call of Pripyat released in February 2010 fail to capture the magic of the original in my opinion. The former was an unmitigated disaster, bug ridden and having none of the flavor that made the original so engaging. The latter returned to more of the game play mechanics of its progenitor, and is arguably the least bug plagued of the three. Recommended for players that loved the original, but for others only if it can be had at a bargain price.
All three games in the series have an active "modding" community, providing new maps, characters, game and character abilities, and modifications of difficulty levels. This adds considerably to the game play value and longevity of the game.
Like many things in life, this is a case where the original is the best.
Highly recommended, grade A material.
The Official Unofficial Bigfoot Challenge: Throwing Down The Gauntlet of Science at Marketing BS.
I decided to move this from my blog entry She Blinded Me WIth Science: Tuning Up Your BS Detectors to make it more easily findable by Bigfoot Networks.
Bigfoot: All ribbing aside, I'm quite serious. Let's put your outlandish claims under the harsh light of scientific scrutiny. Win or lose, a school gets a nice PC, so think of it as an educational opportunity combined with worthwhile charity. You can comment here or e-mail me for the contact information for my counsel, where the contract for the challenge can be fine tuned based on the outline in the challenge below and then finalized.
I look forward to your response.
(1)
Bigfoot networks ("Bigfoot") will supply a (2)
I will provide a newly built, high-performance gaming PC, configured with components typically used by gamers on a budget, not to exceed $2000.00 in total cost. Certainly a machine far below what the highly compensated and sponsored professionals would have. You know, the pros making all the glowing statements on the Bigfoot web site.
Such a machine would of course be handicapped in performance compared to the machines of a pro gamer, so any benefit from the Bigfoot technologies would be expected to be magnified. The machine will have a fresh, unmolested installation of Windows 7 64 Bit operating system, patched to the current patch and update state at the time of the challenge.
The tester (me) will install the game to be used for the challenge on the day of the challenge. The game will be patched to the patch level available at that time. Any customization of the game other than that available via normal in-game interfaces will be provided to the tester for review and approval, and will be applied by the tester, subject to verification by the
(3)
The
(4)
At each game or session start, the identity of the correct and current NIC in use will be noted and placed in a time stamped envelope by the tester.
(5)
The
Alternately, utilizing accepted data security techniques as outlined by Carnac the Magnificent, and in light of Bigfoot's likely concerns about this data being 'leaked', the envelopes can be hermetically sealed, placed in a mayonnaise jar, and at a time no later than noon the following day the aforesaid jar shall be placed on Funk & Wagnalls' porch.
(6)
At no time will any employee of Bigfoot be allowed contact or interaction with the test PC, other than access supervised and approved by the tester to allow Bigfoot to validate the correctness and specifications of the test PC build. The
Specifically, the
Any indicators or other applications that provide notification of NIC or network status, performance, or related information will be disabled or made otherwise unavailable.
(6a)
Any and all non-standard or non-default utilization of the network "stack" or operating system by the Bigfoot "Killer 2100" NIC and associated software will be specified in writing and certified by Bigfoot as encompassing all such utilization. Such documentation will be provided to the tester at least three days before the challenge test to allow configuration of the alternate NIC and OS as appropriate. This includes, but is not limited to: changes in the use of "Nagling"; packet fragmentation or coalescing behaviors; any other interaction with the underlying OS environment and network facilities; and any interaction with the OS affecting process priorities or other characteristics that may affect system, network, game, or process performance in a way advantageous to the Bigfoot NIC and that could also be utilized by the alternate NIC but may not be enabled in the default settings for Windows. Such documentation must include any and all system changes made by the installer(s) for Bigfoot-related software components, and any and all system changes made by the utilization of those components.
(6b)
The tester reserves the right to install and utilize monitoring applications to determine system changes made by the installation of any and all Bigfoot-related software during the installation process and during game play testing. This includes but is not limited to Windows registry, process control, and network object monitoring. In addition, the tester reserves the right to utilize packet analysis tools during game play testing to ensure that the comparison between NICs uses packets of similar nature.
(6a) and (6b) ensure that the installation and use of the Bigfoot NIC and related software do not cause covert or overt changes to normal Windows OS, Network, or process behavior that would have performance impact that could also apply to the alternate NIC but may not be used by default in Windows. In other words, they ensure that it is the card and associated driver and only the card and associated driver that is affecting performance, if any such effect exists.
Since it would obviously be trivial for an installer, application, or driver to make such changes covertly (e.g., my installer could manipulate the on-board NIC parameters unfavorably, and/or manipulate the parameters for my "super nic" favorably to produce favorable results) and no prior "test" of the Bigfoot products has to my knowledge eliminated this possibility, we will ensure this for this challenge.
The tester reserves the right to install and utilize network performance monitoring tools to record quantitative data to be included with the primary score and NIC identification data. This data, if gathered, will be statistically analyzed to determine if any statistically significant differences exist between the Bigfoot NIC and the alternate NIC. Only open source monitoring tools will be utilized, such that Bigfoot can assess the correctness of the monitoring tool(s). By the same token, any tools that the tester chooses to utilize that are supplied by Bigfoot must be provided with complete source code and build environments, allowing independent validation and compilation of the tool(s). This ensures that no manipulation of measurement or output data is done by the Bigfoot supplied tools.
(7)
At no time when at the PC will the
The only interactions allowed with the test PC for the
(8)
I and Bigfoot may agree to a 'disinterested' third party to conduct the actual testing, to include but not limited to the switching of the NICs and recording of data.
(9)
The test will be held at a mutually agreed on location with sufficient network bandwidth. The servers for the games will also utilize providers mutually agreed on by Bigfoot and me. For any game, at least ten providers will be specified, half by the tester and half by Bigfoot. The provider to be used for the tests will be selected by the appropriate random selection process by the tester at the time of the test. If it is determined there will be a LAN component to the challenge tests, the PC used for the server must use a standard retail NIC other than Bigfoot products. Any LAN server PC if used will only have a fresh, default Windows 7 installation and the default installation files and configuration for the game. This is to reduce the possibility of collusion with game server providers and to eliminate the possibility of any covert or overt use of communication techniques dependent on any form of endpoint cooperation and only available when Bigfoot products are at both endpoints of the network.
I humbly suggest the Google campus for the test. Sergey and Larry will surely accommodate us, and they both enjoy a good scientific venture. Heck, they might even help us televise it!
(10)
The Bigfoot "Killer 2100" NIC will be a retail model, purchased by the tester, as part of the test PC build.
No information, such as the MAC address of the NIC, will be made available to Bigfoot until the termination of the test. Both NICs will use a 'spoofed' MAC address during game play, provided and controlled by the tester, to ensure that no identification of which NIC is in use can be attempted via packet analysis.
(11)
The data will be independently analyzed by an agency approved by both parties as to statistical significance of the data gathered.
(12)
The results will be published on the home / front page of the Bigfoot Networks web site. They will remain there for a period not less than one hundred eighty (180) days.
Should any changes to the domain name or other infrastructure prevent visitors to the site from accessing this information, Bigfoot will make the changes needed to ensure the published results remain visible.
(13)
This item is not part of the challenge requirements. It's just a place holder for a "Hello!" to the apparently underpaid lawyers of Bigfoot that will surely be asked to review this and figure out a way to have some kind of "take down" issued so Bigfoot doesn't have to explain why they didn't take up the challenge. Hello!
(14)
If there is a statistically significant
(15)
If no statistically significant
(16)
Any violation of the rules, specifications and stipulations agreed upon will constitute a forfeiture by the violator.
Labels:
A Technical Bent,
Funny,
Gaming,
Hardware and Software Related,
Opinion
Taking a Bite from the Poison Apple: Gaming on a MAC.
I like Apple. I like the Macintosh in all its current incarnations, from mini to laptop to traditional deskside form factors (am I allowed to call it Macintosh, or is it just Mac now? I'm not sure what the current cult protocol dictates.)
I like the marketing of the company: it is consistently slick, polished, and clever.
"What do you think of our super cool NeXT hardware?" he asked me. I responded with something like "It's an already outdated steaming turd that is only falling further behind Intel based stuff each day. Dump it. Dump any of the business that's hardware. Put NeXTSTEP on Intel, it's so much better than Windows, you'll rule the world." You'd have thought I'd told the Pope that God was a phony.
By the time NeXT got their act in gear and dropped the hardware facade and made NeXTSTEP available for commodity hardware, it was too late: Windows had evolved and had grown in market share to an insurmountable lead. What a disaster, a ship steered by an ego blinded captain. Not his first flop, nor will it be his last in my opinion. But make no mistake: Jobs could sell $50.00 bags of ice cubes to Eskimos, and have them believing his version tastes better. You've got to admire that.
I like that since hardly anyone uses the Mac product (under 4% of the market), hackers don't waste their time on attacking the machine. I don't like that Apple markets this under the guise of their machine being more secure than Windows machines: security through obscurity is useless, and the fact that Apple hardware is consistently the first to fall at white hat security hacking contests demonstrates this. Nonetheless, in the same way that no city burglar is going to go out of his way and drive a hundred miles into the countryside just to rob you, the platform is much less likely to find itself under attack. For now at least.
I like that the very narrow range of hardware options used (and controlled) by Apple makes life easier for their OS developers. Stuff just works. That Windows works so well with the innumerable combinations of hardware it can be installed on is miraculous, Apple chose to simplify the problem and has done a superb job of it.
I like the support infrastructure of Apple. This is one area that I've yet to see anything even close in the traditional PC world. The Apple store staff knows their stuff. The genius bar really knows their stuff. A user of the product can get face to face, quality technical support for zero or minimal cost, instead of spending hours talking or on-line chatting with tech support via some version of the typical cookie cutter outsourced staff.
All of this boils down to this: The Mac is the machine least likely to be bought by me, and the most likely to be recommended by me. Except to serious gamers. Allow me to explain.
When friends come to me seeking advice for a PC (I'll use the term generically to mean both traditional hardware and that of the Apple persuasion), I ask them some pretty simple questions.
If the answers indicate that they do not have a need for the latest and greatest hardware, add-in cards, etc. I usually point them to Apple. The machines are easy to learn and use for the novice. They tend to "just work" due to Apple's vice-like grip on the narrow range of hardware components allowed in the machine. The support infrastructure Apple provides means if I'm not around to answer a question, a quick phone call to Apple or a visit to the local store will usually result in rapid resolution. Keeps them off my back.
But for the serious gamer? Well, as Apollo 13 hero Jim Lovell said, "I believe we've had a problem here."
There are a few things standing in the way of the gamer using a Mac that wants state of the art hardware for maximum performance with modern games.
Firstly, excepting the deskside Mac Pro models, there is no real means to update the anemic graphics hardware in the Apple machines. Some of the higher-end MacBook models are capable of running games with acceptable frame rates, but the really sophisticated bleeding edge titles are off-limits if acceptable performance is expected.
Even with the Mac Pro, graphics hardware options are severely limited if the user wants to retain the full range of Apple specific functionality and sensibilities in the interface from the time of powering up the machine: the cards used by Apple require proprietary firmware (for EFI and the OS), meaning non-certified cards will not function properly in OSX, nor will proper boot screen functionality be retained.
This means the user is limited to Apple-specific cards if they wish to retain these capabilities and functionality, and these cards tend to lag severely behind the current state of the art in performance. By way of example, the fastest card at the time of writing on the Apple Mac Pro web page is the ATI Radeon HD 4870, a card released two years ago. While there are some third-party cards of higher specification available, these too are at least a generation behind the state of the art. And of course, either solution carries the burden of the "Apple tax": you will pay more for the same card compared to the PC version.
It is possible to do what is effectively "brain surgery" on more modern cards via firmware manipulation to enable their use in a Mac Pro, but the likelihood of reduced functionality and performance, or of producing an expensive paperweight by such antics, far outweighs the benefits. See the entries at everymac.com and the Expansion Cards and Hardware Compatibility sections of the Wikipedia entry for the Mac Pro for a glimpse into the grief faced by users that need more GPU horsepower in the Mac environment.
Yet even then, the Mac user is boxed in: the latest high performance GPUs are quite power hungry. One may tax the power supply of the Mac Pro, dual cards (SLI or Crossfire) would be out of the question without cobbling some sort of Frankenstein power supply to supplement or supplant the one that comes in the machine.
Secondly, the Mac gamer is faced with the reality that mirrors the disinterest in the Mac by hackers: By and large, game developers don't give a hoot about the Apple environment. The Apple store lists 70 titles. Total. A tiny handful of FPS games.
This means that the Mac owner, if they want to play most any current game, will need to do so using Microsoft Windows. No need to rush out and buy a PC to supplant your Mac because it can't do something you want, however. There are a few ways a Mac owner can play Windows games utilizing their Mac hardware. We'll outline these here.
For simple games (2D, scrollers, etc.) with lightweight graphics performance requirements. a virtual machine environment such as Parallels Desktop or Vmware Fusion will allow the user to install Microsoft Windows and the games they desire into a Virtual Machine. This effectively creates a "computer in the computer", and for simple games will allow the user to play the game without leaving the OSX environment. My own experiments show reasonable performance on a higher-end Mac Pro, so long as the game's graphical requirements are kept reasonable. For games with more rigorous requirements, the performance in a virtual environment severely lags behind that of running on native hardware.
For these kinds of games, the user will need to install Windows in a fashion that allows for native booting. This can be accomplished with Apple's own Boot Camp or in a more flexible but more involved implementation using rEFIt.
Boot Camp provides a trivially simple mechanism for the gamer to get up and running on Windows games on their Mac hardware. The "Boot Camp Assistant" of the installer will walk the user through automatic repartitioning of the disk and installation of Windows. The current OSX install discs contain the hardware drivers for Windows components of the user's Mac, simplifying the installation and configuration of Windows considerably: no need to ferret these out from the web. The details and guides for using Boot Camp can be found at the Apple support page for the product.
rEFIt is an open source EFI boot loader and toolkit for the Mac. Unlike Boot Camp, which is limited to one alternate installation of Windows alongside OSX, rEFIt gives the user the functionality of a traditional boot manager, allowing the installation of multiple OS systems. On power up, the user is presented with a boot menu showing what operating systems are available for selection.
I use rEFIt to run Windows, Linux and OSX. The installation of rEFIt does not have the typical Apple hand holding: it's not an Apple product, after all. That said, the installation and configuration are fairly trivial, and any gamer that has built their own machine should have no trouble getting up and running under rEFIt. The canonical source for installation and configuration is the Sourceforge page for rEFIt.
For either the Boot Camp or rEFIt solutions, I would recommend the gamer determine the precise hardware configuration of their Mac and acquire the latest Windows hardware drivers from the appropriate sources before starting the OS installation process. Often only the most current drivers will provide the desired gaming experience for the newest games (graphics card drivers being particularly fickle.). At the very least, ensure that you have the needed network (NIC) driver, so that once Windows is installed and booted, you can retrieve other needed drivers from the Internet.
You'll also want to get your hands on a decent keyboard and mouse. While the Apple realizations exude a Bang & Olufsen elegance, they're utterly useless for any kind of real gaming.
See you on the battlefields!
I like the marketing of the company: it is consistently slick, polished, and clever.
I like the superiority complex that emanates from Apple's cultish leader Steve Jobs and that seems to ooze out of the corporation and infect many of the consumers of the products. It gives my daughter and me endless fun to watch the iMonkeys at the Apple stores work their conversion magic on prospective cult members, like Kaa working Mowgli (for some reason, things like this seem funnier to me in German.)
I like Steve. Steve Wozniak, that is. Without him, there would be no Apple. A hilarious guy, as down to earth as they get. And brilliant. I like the Jobs version of Steve too, but he couldn't code his way out of a paper bag, and I just have no interest in having a coffee and shooting the breeze with him like I do with "The Woz". Woz has donated huge sums of money and technology to local schools. Education for our children is important to him, and I admire that deeply. He dated Kathy Griffin. That must have been a hoot.
I've enjoyed the rage tantrums of the Jobs incarnation of Steve since my own personal experience with one twenty-some years ago, when he asked me the wrong question.

"What do you think of our super cool NeXT hardware?" he asked me. I responded with something like "It's an already outdated steaming turd that is only falling further behind Intel based stuff each day. Dump it. Dump any part of the business that's hardware. Put NeXTSTEP on Intel, it's so much better than Windows, you'll rule the world." You'd have thought I'd told the Pope that God was a phony.
By the time NeXT got their act in gear, dropped the hardware facade, and made NeXTSTEP available for commodity hardware, it was too late: Windows had evolved and grown in market share to an insurmountable lead. What a disaster: a ship steered by an ego-blinded captain. Not his first flop, nor will it be his last, in my opinion. But make no mistake: Jobs could sell $50.00 bags of ice cubes to Eskimos, and have them believing his version tastes better. You've got to admire that.
I like that since hardly anyone uses the Mac product (under 4% of the market), hackers don't waste their time on attacking the machine. I don't like that Apple markets this under the guise of their machine being more secure than Windows machines: security through obscurity is useless, and the fact that Apple hardware is consistently the first to fall at white hat security hacking contests demonstrates this. Nonetheless, in the same way that no city burglar is going to go out of his way and drive a hundred miles into the countryside just to rob you, the platform is much less likely to find itself under attack. For now at least.
I like that the very narrow range of hardware options used (and controlled) by Apple makes life easier for their OS developers. Stuff just works. That Windows works so well with the innumerable combinations of hardware it can be installed on is miraculous; Apple chose to simplify the problem and has done a superb job of it.
I like the support infrastructure of Apple. This is one area where I've yet to see anything even close in the traditional PC world. The Apple store staff know their stuff. The Genius Bar really knows their stuff. A user of the product can get face to face, quality technical support for zero or minimal cost, instead of spending hours talking or on-line chatting with tech support via some version of the typical cookie cutter outsourced staff.
All of this boils down to this: The Mac is the machine least likely to be bought by me, and the most likely to be recommended by me. Except to serious gamers. Allow me to explain.
When friends come to me seeking advice for a PC (I'll use the term generically to mean both traditional hardware and that of the Apple persuasion), I ask them some pretty simple questions.
If the answers indicate that they do not have a need for the latest and greatest hardware, add-in cards, etc., I usually point them to Apple. The machines are easy to learn and use for the novice. They tend to "just work" due to Apple's vice-like grip on the narrow range of hardware components allowed in the machine. The support infrastructure Apple provides means if I'm not around to answer a question, a quick phone call to Apple or a visit to the local store will usually result in rapid resolution. Keeps them off my back.
But for the serious gamer? Well, as Apollo 13 hero Jim Lovell said, "Houston, we've had a problem."
There are a few things standing in the way of the Mac-using gamer who wants state-of-the-art hardware and maximum performance with modern games.
First, excepting the deskside Mac Pro models, there is no real means to upgrade the anemic graphics hardware in Apple machines. Some of the higher-end MacBook models can run games at acceptable frame rates, but the sophisticated bleeding-edge titles are off-limits if acceptable performance is expected.
Even with the Mac Pro, graphics hardware options are severely limited if the user wants to retain the full range of Apple-specific functionality and interface sensibilities from the moment of powering up the machine: the cards used by Apple require proprietary firmware (for EFI and the OS), meaning non-certified cards will not function properly in OSX, nor will proper boot screen functionality be retained.
This means the user is limited to Apple-specific cards if they wish to retain these capabilities, and these cards severely lag the current state of the art in performance. By way of example, the fastest card listed on the Apple Mac Pro web page at the time of writing is the ATI Radeon HD 4870, a card released two years ago. While there are some third-party cards of higher specification available, these too are at least a generation behind the state of the art. And of course, either solution carries the burden of the "Apple tax": you will pay more for the same card compared to the PC version.
It is possible to do what is effectively "brain surgery" on more modern cards via firmware manipulation to enable use in a Mac Pro, but the likelihood of reduced functionality and performance or of producing an expensive paperweight by such antics far outweighs the benefits. See the entries at everymac.com and the Expansion Cards and Hardware Compatibility sections of the Wikipedia entry for the Mac Pro for a glimpse into the grief faced by users who need more GPU horsepower in the Mac environment.
Yet even then, the Mac user is boxed in: the latest high-performance GPUs are quite power hungry. A single card may tax the power supply of the Mac Pro; dual cards (SLI or CrossFire) would be out of the question without cobbling together some sort of Frankenstein power supply to supplement or supplant the one that comes in the machine.
Second, the Mac gamer faces a reality that mirrors hackers' disinterest in the Mac: by and large, game developers don't give a hoot about the Apple environment. The Apple store lists 70 titles. Total. A tiny handful of FPS games.
This means that the Mac owner, if they want to play almost any current game, will need to do so using Microsoft Windows. No need to rush out and buy a PC to supplant your Mac because it can't do something you want, however: there are a few ways a Mac owner can play Windows games on their Mac hardware. We'll outline them here.
For simple games (2D, scrollers, etc.) with lightweight graphics performance requirements, a virtual machine environment such as Parallels Desktop or VMware Fusion will allow the user to install Microsoft Windows and the games they desire into a virtual machine. This effectively creates a "computer in the computer", and for simple games allows play without ever leaving the OSX environment. My own experiments show reasonable performance on a higher-end Mac Pro, so long as the game's graphical requirements are modest. For games with more rigorous requirements, performance in a virtual environment severely lags behind running on native hardware.
For these kinds of games, the user will need to install Windows in a fashion that allows native booting. This can be accomplished with Apple's own Boot Camp, or in a more flexible but more involved fashion with rEFIt.
Boot Camp provides a trivially simple mechanism for the gamer to get up and running on Windows games on their Mac hardware. The "Boot Camp Assistant" of the installer will walk the user through automatic repartitioning of the disk and installation of Windows. The current OSX install discs contain the Windows hardware drivers for the components of the user's Mac, simplifying the installation and configuration of Windows considerably: no need to ferret these out from the web. The details and guides for using Boot Camp can be found at the Apple support page for the product.
rEFIt is an open source EFI boot loader and toolkit for the Mac. Unlike Boot Camp, which is limited to one alternate installation of Windows alongside OSX, rEFIt gives the user the functionality of a traditional boot manager, allowing the installation of multiple operating systems. On power up, the user is presented with a boot menu showing what operating systems are available for selection.
rEFIt Boot Screen
For either the Boot Camp or rEFIt solution, I would recommend the gamer determine the precise hardware configuration of their Mac and acquire the latest Windows hardware drivers from the appropriate sources before starting the OS installation process. Often only the most current drivers will provide the desired gaming experience for the newest games (graphics card drivers being particularly fickle). At the very least, ensure that you have the needed network (NIC) driver, so that once Windows is installed and booted, you can retrieve other needed drivers from the Internet.
You'll also want to get your hands on a decent keyboard and mouse. While Apple's own offerings exude a Bang & Olufsen elegance, they're utterly useless for any kind of real gaming.
See you on the battlefields!
Earning Your Pathfinding Merit Badge: How Game Characters Navigate Their Game World. A µ post.
As most gamers probably have, I've pondered what mechanism games must use to allow the NPCs in the game to find their way around the game world. Why do games like F.E.A.R. and Halo seem to exhibit more realistic NPC movements?
I was exposed more deeply to the problem when I embarked on building a simple FPS game from scratch including AI, rendering, and netcode. I'd originally planned on a game that mirrored my neighborhood so my friends and I could run around familiar territory and maim each other. It ended up being a shoot-the-aliens sci-fi game: I'm no artist, and it was easier for me to make characters that didn't look like pale imitations of humans.
I'd pulled out all my references in preparation for writing a blog entry outlining the basics of the techniques used for pathfinding in games, when I happened upon a most excellent overview written by a game developer.
They do a much better job than I think I could have; you can view the overview at Fixing Pathfinding Once and For All at the Game/AI blog. Nice graphical explanations, with videos demonstrating the differences in pathfinding strategies.
Other useful background information can be found at A* Pathfinding for Beginners and a short entry at Wikipedia.
My own personal references are the superb four volumes of AI Game Programming Wisdom, and the equally excellent Game Programming Gems, now up to eight volumes with the most recent 2010 release. I was heavily influenced by reading the more academic treatments found in both Artificial Intelligence for Games by Ian Millington and Behavioral Mathematics for Game AI by Dave Mark.
If you have any interest in how realistic character movement is accomplished in games, I highly recommend taking a look at the links, and if you decide to program your own, I can't think of better reference works than the book series I've used.
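For readers who want the flavor of the technique before diving into the links and books: below is a minimal A* sketch on a 4-connected grid, in Python. It's a teaching toy under my own simplifying assumptions (uniform step cost, a tiny hand-made grid), not production pathfinding; real games search navigation meshes or waypoint graphs, as the articles above explain.

```python
import heapq

def astar(grid, start, goal):
    """A* over a 2D grid: 0 = walkable, 1 = blocked.
    Returns the cell path from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])

    def h(cell):
        # Manhattan distance: admissible (never overestimates) on a 4-connected grid
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_heap = [(h(start), 0, start)]  # entries are (f = g + h, g, cell)
    came_from = {}
    best_g = {start: 0}

    while open_heap:
        f, g, cell = heapq.heappop(open_heap)
        if cell == goal:
            # walk the came_from chain backwards to recover the path
            path = [cell]
            while cell in came_from:
                cell = came_from[cell]
                path.append(cell)
            return path[::-1]
        if g > best_g.get(cell, float("inf")):
            continue  # stale heap entry; a cheaper route was already found
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best_g.get(nxt, float("inf")):
                    best_g[nxt] = ng
                    came_from[nxt] = cell
                    heapq.heappush(open_heap, (ng + h(nxt), ng, nxt))
    return None

# A wall down column 2 with a single gap at the bottom row:
level = [
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 0, 0, 0],
]
route = astar(level, (0, 0), (0, 4))  # 13 cells: the shortest detour through the gap
```

The heuristic is what separates A* from plain Dijkstra: it steers the search toward the goal, which is why it scales to game-sized worlds.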
Tuesday, May 11, 2010
She Blinded Me With Science: Tuning Up Your BS Detectors.
PCs, operating systems, specialized hardware, 'tweaks'. Just some of the pieces in the FPS gamer's world. Like any field with such esoterica, there's plenty of snake oil to go around.
Vendors of software tempt the player with promises of improved game performance, while hardware is presented as being able to improve your pings. Mice are made with DPI ratings that need to use scientific notation. The list goes on and on.
What is the intelligent FPS gamer to make of these, and how can they be sure that their money is well spent on things that really can make a difference in their gaming acumen?
We'll tear into a couple of examples in this blog entry, and show how a healthy dose of skepticism toward many of the claims bandied about the Internet by producers of these products and posters in forums could save the gamer time and money.
I've chosen two areas that have in particular pegged my BS detector:

- Game Booster, and other software that alleges to 'speed up' your PC for gaming.
- Specialized 'gaming' network cards that claim reduced pings and latency.

For each, I'll apply three simple questions:

- Is there a valid reason this should improve my gaming experience?
- Is there a repeatable, scientific test and measurement to validate the claims, or are they largely anecdotal and subjective?
- Even if there is a measurable difference, does it make a difference for gaming?
What? How exactly does this piece of free software know how to manage these things better than Microsoft's own OS? Each of the techniques used has been shown to have little if any merit.
It has been shown repeatedly that file fragmentation has little bearing on the performance of Windows and applications for any modern version of Windows. In the days of the FAT file system, defragmentation could make a noticeable difference, but that was because of a file system design that had outgrown its usefulness.
With modern file systems on Windows (NTFS), defragmentation is frankly a waste of time, and its biggest effect is likely physical wear and tear on your hard drive. Defragmentation is strictly a no-no for solid-state drives, and garners absolutely no benefit.
I am not saying defragmentation of the file system in certain corner cases cannot show some measurable change. It can. I am saying such a change will have no material impact on the performance of a game. I invite any reader who can show otherwise with a repeatable test that will pass muster for scientific validity to comment with references to such a test. I know of none.
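To make concrete what I mean by a repeatable test: measure a game-relevant number (level load time, say) many times before and after the tweak, and only call the change real if it exceeds run-to-run noise. Here's a hedged Python sketch of such a harness; the `fake_level_load` workload and the 2×-stdev threshold are my own stand-ins, not a proper statistical protocol.

```python
import statistics
import time

def benchmark(task, trials=30):
    """Run `task` repeatedly; return (mean, stdev) of wall time in ms."""
    samples = []
    for _ in range(trials):
        t0 = time.perf_counter()
        task()
        samples.append((time.perf_counter() - t0) * 1000.0)
    return statistics.mean(samples), statistics.stdev(samples)

def material_difference(before, after):
    """Crude noise gate: only call a change real if the means differ by
    more than twice the larger run-to-run standard deviation."""
    (mean_b, sd_b), (mean_a, sd_a) = before, after
    return abs(mean_b - mean_a) > 2 * max(sd_b, sd_a)

# Stand-in for "load a level": any repeatable, game-relevant workload will do.
def fake_level_load():
    sum(i * i for i in range(50_000))

before = benchmark(fake_level_load)  # measure before the tweak
# ... apply the 'tweak' (defragment, kill services, whatever) here ...
after = benchmark(fake_level_load)   # measure again afterwards
print(material_difference(before, after))
```

If a vendor can't show you numbers that pass even this crude a gate, you've learned everything you need to know.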
The 'shutting down of background processes', whether done manually or with some program, has long been a 'tweak' recommended in enthusiast forums. In particular, recommendations are made to disable Windows intrinsic services and processes. Utter BS! There is again no test I've ever seen that shows this to have any material effect on game performance. Messing with Windows facilities can have very deleterious effects on the performance and reliability of the OS, and should be considered off limits by any informed gamer.
Now, doing this can certainly affect boot times: these Windows components are typically loaded during the boot process, so that obviously takes some time. But once booted, Windows manages these components quite efficiently, moving them aside if an application such as a game needs resources. There is no material penalty in leaving them alone.
Of course, if the gamer has tons of junkware or other software installed on their gaming machine, these may cause issues. In these cases, shutting down or disabling these could benefit game play. But seriously, what sane gamer who cares about game performance plays on a machine burdened with junk?
The solution to problems like this is to have a proper gaming environment, and not install crud in the first place. See Swimming in the Septic Tank with my Gaming Buddies for how I set up my gaming machines, a model that I believe is ideal for the serious gamer.
I'm not even sure how to comment on the claim of 'cleaning RAM, and intensifying processor performance'. Windows manages memory. The way it should be managed. 'Memory Cleaners' and 'Extenders' have long been known to belong to the snake oil camp of PC 'tweaks'. Enough said.
Let's answer our three questions for this product.

- There's no valid reason this product should have any material effect on a properly configured gaming environment. If you have junk software baggage getting in the way of your gaming pleasure, uninstall it, or just build a proper environment in the first place.
- I've seen no repeatable, scientifically valid test to validate the claims for this product when installed in a proper gaming environment. Only anecdotal claims, about as valid as the miracle cures claimed for colon cleansing.
- Since there does not appear to be a measurable difference on a properly set-up PC, I'd expect the product to have no material effect. So it doesn't matter to the gamer.
The marketing materials and hype that surrounded this initial product were mind numbing. Oral Roberts, Tony Robbins, and Billy Mays combined. There was (and still is), however, at the same time a suspicious lack of any tests with reasonable protocols and environments showing any material improvement. This kind of hype should always raise an eyebrow. Their current campaign is filled with outlandish testimonials (how many of these players paid for their cards, you might ask) that remind me of a revival meeting. No scientifically testable numbers to be found anywhere. Big surprise.
The tests I've seen, both in-house and out, seemed to all have been done on hardware that was borderline at best for a serious gamer. While offloading the network processing from an overloaded OS and hardware platform with a mediocre on-board NIC chip may demonstrate measurable performance benefits in network response and even frame rates due to reduced CPU utilization, no gamer with the $300.00 to spend at the time was likely to be running games on such hobbled hardware. And if they were, the money could have been spent far more effectively elsewhere to improve hardware performance (RAM, CPU, add-in NIC, etc.).
I am not aware of any rigorous, scientifically valid test of these products on a modern gaming machine that show any material effect on performance in areas that matter to the gamer. Reviews by magazines and tech sites reflect this view.
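For what it's worth, the measurement that would settle this isn't exotic. Below is a hedged Python sketch that times UDP round trips and reports the median and 95th percentile, the distribution numbers a NIC vendor ought to publish. It assumes an echo service at the target (real game servers don't echo like this; a vendor's lab rig would), so treat it as the shape of a valid test, not a tool to point at your favorite server.

```python
import socket
import statistics
import time

def measure_rtt(host, port, samples=20, timeout=1.0):
    """Time UDP round trips to an echo service; returns sorted RTTs in ms."""
    rtts = []
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout)
        for seq in range(samples):
            t0 = time.perf_counter()
            sock.sendto(str(seq).encode(), (host, port))
            sock.recvfrom(1024)  # block until the echo comes back
            rtts.append((time.perf_counter() - t0) * 1000.0)
    return sorted(rtts)

def summarize(rtts):
    """Median and 95th-percentile RTT: report the distribution, not one lucky ping."""
    return statistics.median(rtts), rtts[int(0.95 * (len(rtts) - 1))]
```

Run it once with the onboard NIC and once with the 'gaming' card, against the same host in the same hour, and compare the two distributions. If they overlap within noise, the card made no material difference, whatever the testimonials say.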
In their most recent incarnation, the company has moved the performance metric to a measurement of their own creation, using tools defined and built by them. "Trust me. See the difference it makes in the flow rate of the Flux Capacitors?" This is starting to smell like some of the marketing done in other hobbies with kook esoterica, like high-end audio products.
Let's look at our BS detector questions for this item:

- Pings are important to gamers. They're also largely out of the gamer's control: once the gamer's packets are on the WAN, there's nothing they can do to improve pings. That's probably the reason the company dropped this as their marketing silver bullet and moved on to new nonsense. There's no valid reason improvements in the new metric should result in a material improvement of the gaming experience.
- The only test with any sort of measurement for the current product ("Killer 2100") extant at this time is the in-house test of the in-house-created metric using the in-house-created tool. There have been no scientifically valid tests showing any material benefit to a gamer. Suspiciously, there are absolutely no details in the marketing diarrhea found on the company site describing the hardware configuration used for their comparison. Lots of "testimonials" though. Sound like a late-night TV commercial to you?
- Even if the results of the in-house test are valid, the idea that this would make any material difference to game play strains credulity. It's like audio cable makers that charge tens of thousands of dollars for a pair of speaker cables, claiming incredible improvements wrought by the 10 MHz bandwidth of their cable. Problem is, human hearing drops out orders of magnitude lower in the spectrum, and there has never been a scientifically valid test showing any benefit of such cables. The numbers look good in their own tests for the NIC, as they do for the speaker cables. They just don't matter.
If the gamer applies these three simple questions when considering a new piece of hardware, software, or the application of some 'tweak' seen in a forum, I think they will save themselves time and money, and avoid going down the rabbit hole of nonsense.
I try to look at the PC and gaming world, and the world in general, through glasses colored by logic and rationality. Take a look at my thoughts on other areas where I think the manure is deep at I Can See CXLVI Frames Per Second!, I Read it in a National Enquirer Survey, it Must Be True!, Mutation on the Bounty, and Port Forwarding: Slaying the Mythical Dragon of Online PC Gaming.
For a great overview of how to think using logic, rationality, and healthy skepticism, check out the materials at The James Randi Educational Foundation and their Million Dollar Challenge, most recently applied to ultra-expensive high-end speaker cables (well, almost applied: the cable vendor chickened out!)
There's also an amusing snippet How to be a Skeptic on WikiHow, check it out!
Update:
After reviewing some of the grotesque marketing tactics at the pages of Bigfoot Networks, I have made a public challenge.
I invite readers to e-mail them with a link (I wanted to put a cool "mailto" link here, but strangely the contact information for the company headquarters doesn't have an e-mail. For that matter, it doesn't have a phone number. The "Texas Office" has a phone number, but I'm not clear if the address has a suite number or a self-storage shed number. In any case, no e-mail there either. If a reader finds one, let me know! I want to donate a PC to a worthy school.)
Their lawyers (must be pretty cheap, if "Grubby" the master Warcraft player they use as a reference "earns more in a year than your average lawyer.") can contact me and my lawyers for details.
In the spirit of my earlier "Bounty" blog entries Mutation on the Bounty and I Can See CXLVI Frames Per Second!, I will open this up to a public challenge at some future date. I first want to give the Bigfoots at Bigfoot a chance to accept it.
"Grubby". That's a good one. I wonder how much pings really matter for flashing pink hooves on the royal mount, or whatever they call them in those kinds of games like Warcraft where cat-like reactions are required. Not!
The challenge has been moved to its own entry here.
As an aside, as per the earlier bounties, I'll not be posting any anecdotal comments from readers claiming they've "seen" this particular bigfoot. If you think you can meet this challenge and are able to demonstrate the ability and willingness to suffer the penalty of a loss, or if you have an idea for a challenge related to this that you want to propose, do so via a comment or email. Again, I'll not be posting any of the "Well, I can tell the difference 'cause my cat says so" genre of comments.
Vendors of software tempt the player with promises of improved game performance, while hardware is presented as being able to improve your pings. Mice are made with DPI ratings that need to use scientific notation. The list goes on and on.
What is the intelligent FPS gamer to make of these, and how can they be sure that their money is well spent on things that really can make a difference in their gaming acumen?
We'll tear into a couple of examples in this blog entry, and show how a healthy dose of skepticism toward many of the claims bantered about the Internet by producers of these products and posters in forums could save the gamer time and money.
I've chosen two areas that have in particular pegged my BS detector.
- Game Booster, and other software that alleges to 'speed up' your PC for gaming.
- Specialized 'gaming' Network Cards, that make claims of reduced pings and latency.
- Is there a valid reason this should improve my gaming experience?
- Is there a repeatable, scientific test and measurement to validate the claims, or are they largely anecdotal and subjective?
- Even if there is a measurable difference, does it make a difference for gaming?
What? How exactly does this piece of free software know how to manage these things better than Microsoft's own OS? Each of the techniques used have been shown to have little if any merit.
It has been shown repeatedly that file fragmentation has little bearing on the performance of Windows and applications for any modern version of Windows. In the days of the FAT file system, defragmentation could make a noticeable difference, but that was because of a file system design that had outgrown its usefulness.
With modern file systems on Windows (NTFS), defragmentation is frankly a waste of time, and its biggest effect is likely physical wear and tear on your hard drive. Defragmentation is strictly a no-no for Solid State drives, and garners absolutely no benefit.
I am not saying defragmentation of the file system under certain corner cases cannot show some measurable change. It can. I am saying such a change will have no material impact in the performance of a game. I invite any reader that can show otherwise with a repeatable test that will pass muster for scientific validity to comment with references to such a test. I know of none.
The 'shutting down of background processes', whether done manually or with some program has long been a 'tweak' recommended in enthusiast forums. In particular, recommendations are made to disable Windows intrinsic services and processes. Utter BS! There is again no test I've ever seen that shows this to have any material effect on game performance. Messing with Windows facilities can have very deleterious effects on the performance and reliability of the OS, and should be considered off limits by any informed gamer.
Now doing this can certainly affect boot times: these Windows components are typically loaded during the boot processes, so this will take some time obviously. But once booted, Windows manages these components quite efficiently, moving them aside if an application such as a game needs resources. There is no material penalty leaving these alone for a gamer.
Of course, if the gamer has tons of junkware or other software installed on their gaming machine, these may cause issues. In these cases, shutting down or disabling these could benefit game play. But seriously, what sane gamer that cares about game performance plays on a machine burdened with junk?
The solution to problems like this is to have a proper gaming environment, and not install crud in the first place. See Swimming in the Septic Tank with my Gaming Buddies for how I set up my gaming machines, a model that I believe is ideal for the serious gamer.
I'm not even sure how to comment on the claim of 'cleaning RAM, and intensifying processor performance'. Windows manages memory. The way it should be managed. 'Memory Cleaners' and 'Extenders' have long been known to belong to the snake oil camp of PC 'tweaks'. Enough said.
Let's answer our three questions for this product.
- There's no valid reason this product should have any material affect for a properly configured gaming environment. If you have junk software baggage getting in the way of your gaming pleasure, uninstall it or just build a proper environment in the first place.
- I've seen no repeatable, scientifically valid test to validate the claims for this product when installed in a proper gaming environment. Only anecdotal claims, about as valid as the miracle cures claimed for colon cleansing.
- Since there does not appear to be a measurable difference on a properly setup PC, I'd expect the product to have no material effect. So it doesn't matter to the gamer.
The marketing materials and hype that surrounded this initial product were mind numbing. Oral Roberts, Tony Robbins, and Billy Mays combined. There was (and still is) however at the same time a suspicious lack of any tests with reasonable protocols and environments showing any material improvement. This kind of hype should always raise an eyebrow. Their current campaign is filled with outlandish testimonials (how many of these players paid for their cards, you might ask) that they remind me of a revival meeting. No scientifically testable numbers to be found anywhere. Big surprise.
The tests I've seen, both in-house and out, seemed to all have been done on hardware that was borderline at best for a serious gamer. While offloading the network processing from an overloaded OS and hardware platform with a mediocre on-board NIC chip may demonstrate measurable performance benefits in network response and even frame rates due to reduced CPU utilization, no gamer with the $300.00 to spend at the time was likely to be running games on such hobbled hardware. And if they were, the money could have been spent far more effectively elsewhere to improve hardware performance (RAM, CPU, add-in NIC, etc.)
I am not aware of any rigorous, scientifically valid test of these products on a modern gaming machine that show any material effect on performance in areas that matter to the gamer. Reviews by magazines and tech sites reflect this view.
In their most recent incarnation, the company has moved the performance metric to a measurement of their own creation, using tools defined and built by them. "Trust me. See the difference it makes in the flow rate of the Flux Capacitors?" This is starting to smell like some of the marketing done in other hobbies with kook esoterica, like high-end audio products.
Let's look at our BS detector questions for this item:
- Pings are important to gamers. They're also largely out of the gamer's control. Once the gamer's packets are on the WAN, there's nothing they can do to improve pings. Probably the reason the company dropped this as their marketing silver bullet, and moved to new nonsense. There's no valid reason improvements in the new metric should result in a material improvement of the gaming experience.
- The only test with any sort of measurement for the current product ("Killer 2100") extant at this time is the in-house test of the in-house created metric using the in-house created tool. There have been no scientifically valid tests showing any material benefit to a gamer. Suspiciously, there are absolutely no details in the marketing diarrhea found on the company site describing the hardware configuration used for their comparison. Lots of "testimonials" though. Sound like a late night TV commercial to you?
- Even if the results of the in-house test are valid, that this would make any material difference to game play strains credulity. Like audio cable makers that charge tens of thousands of dollars for a pair of speaker cables, claiming the incredible improvements wrought by the 10 MHz bandwidth of their cable. Problem is, human hearing drops out orders of magnitude lower in the spectrum, and there has never been a scientifically valid test to show any benefit of such cables. The numbers look good in their own tests for the NIC, as they do for the speaker cables. They just don't matter.
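The common thread in these three questions is a demand for numbers gathered under a controlled protocol rather than testimonials. As a minimal sketch of what gathering such numbers might look like, here is a latency sampler using a plain TCP connect round-trip as a crude proxy; real game traffic is typically UDP, so this is illustrative only, not a substitute for a proper test rig:

```python
import socket
import statistics
import time

def tcp_rtt_samples(host, port=80, count=10):
    """Measure TCP connect round-trip times (in ms) as a rough latency proxy."""
    samples = []
    for _ in range(count):
        start = time.perf_counter()
        try:
            with socket.create_connection((host, port), timeout=2):
                pass
        except OSError:
            continue  # drop failed attempts rather than recording a bogus number
        samples.append((time.perf_counter() - start) * 1000.0)
    return samples

def summarize(samples):
    """Report mean and spread -- the numbers any honest comparison would publish."""
    return {
        "n": len(samples),
        "mean_ms": statistics.mean(samples),
        "stdev_ms": statistics.stdev(samples) if len(samples) > 1 else 0.0,
    }
```

Run the same sampler against the same server, with and without the product installed, and compare the summaries: if the means differ by less than the spread, there is no demonstrated effect.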
If the gamer applies these three simple questions when considering a new piece of hardware, software, or the application of some 'tweak' seen in a forum, I think they will save themselves time and money, and avoid going down the rabbit hole of nonsense.
I try to look at the PC and gaming world, and the world in general, through glasses colored by logic and rationality. Take a look at my thoughts on other areas where I think the manure is deep at I Can See CXLVI Frames Per Second!, I Read it in a National Enquirer Survey, it Must Be True!, Mutation on the Bounty, and Port Forwarding: Slaying the Mythical Dragon of Online PC Gaming.
For a great overview of how to think using logic, rationality, and healthy skepticism, check out the materials at The James Randi Educational Foundation and their Million Dollar Challenge, most recently applied to ultra-expensive high-end speaker cables (well, almost applied: the cable vendor chickened out!)
There's also an amusing snippet How to be a Skeptic on WikiHow, check it out!
Update:
After reviewing some of the grotesque marketing tactics at the pages of Bigfoot Networks, I have made a public challenge.
I invite readers to e-mail them with a link (I wanted to put a cool "mailto" link here, but strangely the contact information for the company headquarters doesn't have an e-mail. For that matter, it doesn't have a phone number. The "Texas Office" has a phone number, but I'm not clear if the address has a suite number or a self-storage shed number. In any case, no e-mail there either. If a reader finds one, let me know! I want to donate a PC to a worthy school.)
Their lawyers (must be pretty cheap, if "Grubby" the master Warcraft player they use as a reference "earns more in a year than your average lawyer.") can contact me and my lawyers for details.
In the spirit of my earlier "Bounty" blog entries Mutation on the Bounty and I Can See CXLVI Frames Per Second!, I will open this up to a public challenge at some future date. I first want to give the Bigfoots at Bigfoot a chance to accept it.
"Grubby". That's a good one. I wonder how much pings really matter for flashing pink hooves on the royal mount, or whatever they call them in those kinds of games like Warcraft where cat-like reactions are required. Not!
The challenge has been moved to its own entry here.
As an aside, as per the earlier bounties, I'll not be posting any anecdotal comments from readers claiming they've "seen" this particular bigfoot. If you think you can meet this challenge and are able to demonstrate the ability and willingness to suffer the penalty of a loss, or if you have an idea for a challenge related to this that you want to propose, do so via a comment or email. Again, I'll not be posting any of the "Well, I can tell the difference 'cause my cat says so" genre of comments.
Monday, May 10, 2010
Perplexed Execs Dissect PhysX.
PhysX, the accelerated game physics technology from the 2002 upstart Ageia, has officially been taken off life support by Nvidia, the GPU behemoth that acquired Ageia in early 2008. While the PhysX API continues to be actively supported and developed, acceleration will henceforth be supported only through the GPGPU capabilities of Nvidia graphics cards. The dedicated specialized add-in cards, first produced by Ageia and later licensed to other companies such as BFG and ASUS, are effectively dead.
This was, in my mind, a case of the light at the end of the tunnel turning out to be an oncoming train. As I predicted in the HardOCP forums during the early days of Ageia, this was a dead man walking. GPGPU was already coming into its own at the time, and it was patently clear that graphics cards would be able to do the same types of calculations as the pricey Ageia add-in card on hardware already owned by the gamer. Perhaps not as rapidly at the time, but just as Apple found itself hobbled by the glacial performance progress of the PowerPC CPUs it once used compared to Intel's rapid pace, it was clear that the performance progress of GPU hardware would rapidly surpass that of the custom hardware dictated by Ageia.
Owners of Nvidia GPU hardware can continue to enjoy the benefits of the PhysX technology, but owners of the various dedicated add-in cards now find themselves with expensive paperweights, unless they choose to never update their Nvidia drivers.
Accelerated game physics remains the red-headed stepchild of the gaming world. The list of games that utilize the proprietary API is rather limited, with developers more likely to use their own physics technology or a middleware package such as Havok.
Long term, Microsoft will surely incorporate a game physics component into DirectX with abstractions similar to the audio and graphics components. This will allow game developers to utilize a common API that shields them from the vagaries of the underlying physics 'engine', be it GPGPU based (Nvidia or ATI) or middleware based (Havok, ODE, etc.)
Gamers will see better and better implementations of accelerated, complex physics in their games for certain, but we've really only seen baby steps up until now.
My prediction for the next piece of dedicated hardware that will fall by the wayside? The ridiculous voodoo-ware by Bigfoot Networks, producers of the ludicrous 'Killer' network interface card series.
One look at their marketing hype should raise red flags for any intelligent gamer. I've yet to see any of their 'tests' that are plastered all around their site contain any details of the protocols and systems involved, giving them more of the appearance of some late night infomercial for Miracle Magnetic Healing Shoe Inserts than that of a truly valid scientific test.
Surely, this is hardware about as useful to a gamer on a modern high-performance gaming PC as a diamond-encrusted platinum swastika is to a rabbi. I'm sure they'll find some special customers for these products, just like the makers of overpriced ($55,000.00 a pair!) speaker cables do, even though it has been shown that audio kooks couldn't discern the difference between expensive cables and wire coat hangers.
I hope my favorite 'Minimum BS' PC magazine gets their hands on a recent model and puts it through the wringer. PC gaming enthusiasts have a myriad of better things to spend their hard earned cash on.
Heck, I may go buy one myself and put an end to this nonsense with a controlled scientific test.
Z9$^&$^03H#$^90efjdv28e0^#%$*( dR - Oops, sorry, I just slipped on a puddle of snake oil...
Saturday, May 8, 2010
Spies Like Us.
I'll be participating in another Bash webcast tomorrow. The subject is the increasingly cryptic goings-on at GKNOVA6, the viral marketing website that appears to be related to Black Ops, the upcoming release in Treyarch's Call of Duty series. You can see the progress evolve by taking a look at GKNOVA6 at the Call of Duty Wiki.
Clever sleuthing by interested gamers around the world has led to the analysis and decipherment of a host of messages, including some craftily hidden steganographic text (see my blog entry Steganography, GKNOVA6 Style for the first example and details on how this can be done).
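For readers curious what "craftily hidden" can mean in practice, here is a minimal sketch of one classic technique, least-significant-bit steganography over a raw byte stream. This is a generic illustration only, not the specific method used in the GKNOVA6 material (see the earlier entry for that):

```python
def hide(cover: bytes, secret: bytes) -> bytes:
    """Embed the secret's bits into the lowest bit of each cover byte."""
    bits = [(b >> i) & 1 for b in secret for i in range(7, -1, -1)]
    if len(bits) > len(cover):
        raise ValueError("cover too small for secret")
    out = bytearray(cover)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # overwrite only the least-significant bit
    return bytes(out)

def reveal(stego: bytes, length: int) -> bytes:
    """Recover 'length' bytes by reading back the low bit of each stego byte."""
    out = bytearray()
    for i in range(length):
        byte = 0
        for bit in stego[i * 8:(i + 1) * 8]:
            byte = (byte << 1) | (bit & 1)
        out.append(byte)
    return bytes(out)
```

The altered bytes look essentially unchanged to a casual observer (each differs by at most one in value), which is exactly why this family of tricks works so well in images and audio.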
I've been invited to participate with Jock the host and his guests Josh Peckler from planetcallofduty.com and Carbonfibah, one of the seven recipients of the mystery packages with the USB keys that started the whole ball rolling and a major contributor to the code breaking efforts. Having a background in cryptography and secure systems, I hope to add some value talking about the various methods seen as this has unfolded, and the techniques applied to extract the results. We'll also be discussing the evidence pointing to this being a campaign from the minds of Treyarch.
This is the second recent instance of some very clever marketing by game developers: a few months ago, the popular game Portal from Valve Software had an update. The only material change seemed to be that more of the cute portable radios were to be found throughout the game. What soon unfolded, however, is that each new radio carried some kind of message. Some were normal-sounding transmissions, a few were Morse code, some sounded peculiar, almost like an old FAX machine (these turned out to be Slow Scan TV signals, a technique used by ham radio operators), and yet other radios only gave up their secrets when brought to just the right place in the game.
Careful ratiocination by followers of the game led to the decipherment of the various messages, which pointed them to a web site, done in an old-school bulletin board style, that, when logged into, led the user to Aperture Science Laboratories. Portal 2, anyone?
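The Morse transmissions, at least, are the mechanical part of that decipherment: once the dots and dashes are transcribed by ear, decoding is just a lookup table. A minimal sketch (international Morse, letters only, with '/' standing in for word breaks):

```python
# International Morse code, letters only -- enough for a demonstration.
MORSE = {
    ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
    "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
    "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
    ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
    "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
    "--..": "Z",
}

def decode_morse(message: str) -> str:
    """Decode space-separated Morse letters; '/' separates words."""
    words = []
    for word in message.split("/"):
        letters = [MORSE.get(symbol, "?") for symbol in word.split()]
        words.append("".join(letters))
    return " ".join(words)
```

The hard part, of course, is everything before the lookup: recognizing that a peculiar noise is Morse (or SSTV) in the first place, and transcribing it accurately.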
Stay tuned for updates on the webcast, I'll update this entry when it goes live! I can't tell you anything more about the details at this time.
"We could tell you, but then we'd have to kill you." - Dr. No
Update 05/09/2010: Had a blast, again, with Jockyitch and his cohorts on the BASH cast. Learned much from Josh and Carbonfibah about the evolution of the messages from GKNOVA6, and how the swarms of gamers worldwide went about deciphering them. Look for the cast at BashandSlash soon!
Update 05/10/2010: Webcast / Podcast is live, here!