Wednesday, October 27, 2010
Schannel Event 36888 ( 10 / 10 ) When Playing BFBC2 / MOH / Etc. - WTF?
Specifically, the event log entry in the windows system log is:
Event 36888, Schannel
The following fatal alert was generated: 10. The internal error state is 10.
When I first saw the error myself, I recognized it from my network programming days as an informational error, indicating some kind of barf-o-rama on the server side of a secure connection handshake. Unlike most of the other Schannel event IDs, this particular one seems to remain undocumented. Nonetheless, the Info opcode and 10 / 10 Alert Description and Error State hint strongly at it being server side.
Since it seemed to have no material effect on the playability of the game(s), my interest in investigating it stopped there. A recent poster, however, indicated that disabling their A/V (Trend) remedied the apparently related game issues. While the game itself appears to run correctly despite encountering the Schannel error, it may be that some A/V products that muck with everything on the wire take drastic action in the face of it. Strange if some do, but plausible.
In any case, barring some other application / utility causing problems (e.g., said A/V), the error itself can be safely ignored. If it really bothers you, you can change the logging level via a registry change by modifying (or adding if needed) the key:
HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\SecurityProviders\SCHANNEL
A DWORD value named EventLogging set to 0 will eliminate such event log messages. Note that current versions of Windows seem to be more voluble about these errors - on older versions (e.g., XP), the error may occur without a log entry being generated.
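If you go the registry route, the change can be captured in a .reg file like the following sketch (as always, back up your registry first; the value may need to be created if it does not already exist):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\SecurityProviders\SCHANNEL]
"EventLogging"=dword:00000000
```

Save it as, say, schannel-quiet.reg and double-click it to merge. A reboot may be needed before the logging change takes effect.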
I became interested in this event / error again recently while tracing the game's network traffic to debug a separate issue. Both games are built on the same engine / network infrastructure, so it is not surprising they share the same frailties.
From an outsider's view (since I have no access to the game source code, nor the EA master or game servers, my view must be the result of probing and testing theories, using debuggers and traffic sniffers), the network infrastructure for these games is a bit of a wreck. In the same way one might surmise their neighbor is a slob from observations of the trash bags thrown on their front lawn, the mishmash of connections and traffic these games generate is appalling. The possibilities of problems due to one piece failing or being unavailable are surely a source of grief for many attempting to play these games online.
If this system was designed this way from scratch, someone should be publicly whipped with a length of Ethernet cable. If it is the result of 'evolution' of features and functionality by adding servers to the 'master' pool, the time has perhaps come for EA to rethink the infrastructure and rebuild it from scratch.
In any case, the Schannel error in these games appears to be generated by an improperly configured EA server that provides them with hardware information à la Steam's hardware survey.
Another way to eliminate the error (and stop spying by EA, if that's your stance), is to add the following to the file \windows\system32\drivers\etc\hosts:
127.0.0.1 bf1943hwtelemetry.ea.com
This prevents the game(s) from even starting the handshake process, short-circuiting the error path.
In summary: The error is harmless, it is not the cause of crashes / etc. in the game itself per se though it appears it might throw programs such as A/V into a tizzy (when I feel like it, I may investigate this further.) You can just ignore it, or if it bothers you having it in your event log, take one or both of the steps outlined above.
Saturday, May 15, 2010
The Alpha and The Omega of "Search" Engines?: WolframAlpha
Ask the engine something more precise, such as "United States population growth", and the character of the system begins to emerge:
Combine this with calculation capabilities second only to actually purchasing a copy of Mathematica to run locally, and you have truly phenomenal capacity for data gathering, analysis, and calculation at your fingertips.
Here, WolframAlpha shows us the integral of a more complex expression, integral (sin(theta)/cos(theta)^2^pi):
Give it a try, and visit the site at WolframAlpha for more details.
To give it a whirl right now, try typing "ISS now" (without the quotes) into the WolframAlpha box at the top of the blog entry, and pressing enter or clicking on the equals sign. Be prepared to be impressed!
Thursday, May 13, 2010
I've Got A Stalker: S.T.A.L.K.E.R. Shadow of Chernobyl, that is.
S.T.A.L.K.E.R.: Scavenger, Trespasser, Adventurer, Loner, Killer, Explorer, Robber.
I was blown away by the description and the screen shots shown, the graphics looked amazing compared to any shooter we'd played up to that time. The rumored release of the game was soon, so the juices started flowing for what looked to be a most excellent addition to the game collection.
Unfortunately, the game faced delay after delay, eventually resulting in a ninth place finish in Wired's Vaporware '06 contest. In January 2007, a contest for players to experience the game beta in a marathon session collapsed when the THQ staff (publishers of the game) that had organized the event were themselves unable to obtain copies of the game.
The game finally made its public debut at the end of March, 2007.
By fulfilling mission requests provided by all sorts of NPCs in the game and by obtaining or otherwise finding valuables, the player builds a stock of goods that can be sold or traded for items such as food, weaponry, clothing, ammunition, artifacts, etc.
The artifacts in the game play a key role in trading and player protection:

"Most anomalies produce visible air or light distortions and their extent can be determined by throwing bolts (of which the player carries an infinite supply) to trigger them. Some stalkers also possess an anomaly detector, which emits warning beeps of a varying frequency depending on their proximity to an anomaly. The guide in the film Stalker, and his predecessors in the Strugatsky brothers' book Roadside Picnic, test various routes before proceeding. In the film, metal nuts tied with strips of cloth are used.
Anomalies produce Artifacts, the valuable scientific curiosities that make the Zone worth exploring monetarily. As well as being traded for money, a number of Artifacts can be worn so that they provide certain benefits and detriments (for example, increasing a stalker's resistance to gunfire while also contaminating him with small amounts of radiation). Artifacts are found scattered throughout the Zone, often near clusters of anomalies."
At first, I found this aspect of the game boring and time consuming. Little did I know I would soon become addicted to the hunt, particularly for the more rare items. I soon came to appreciate this type of game play, common in the MMORPG games such as World of Warcraft that I'd poked fun at.
Game play covers a huge area of many square miles. Originally, the game was to be completely open, but by release, the map had been subdivided into eighteen areas, each reachable through specific passages. Nonetheless, the game play always feels expansive, with superb draw distances.
This, combined with the wide range of choices in interactions and missions (Who do I want to befriend? Who do I decide to trade with? What items do I want for my character?) leads to excellent replay value in the game.
The game is based in the in-house developed X-Ray graphics rendering engine. From the Wikipedia entry:
"The X-ray Engine is a DirectX 8.1/9 Shader Model 3.0 graphics engine. Up to a million polygons can be on-screen at any one time. The engine features HDR rendering, parallax and normal mapping, soft shadows, motion blur, widescreen support, weather effects and day/night cycles. As with other engines that use deferred shading, the X-ray Engine does not support anti-aliasing with dynamic lighting enabled. However, a "fake" form of anti-aliasing can be enabled with the static lighting option; this format utilizes a technique to blur the image to give the false impression of anti-aliasing."
Even in 2010, the graphics hold up well, especially on high-end machines.
The A.I. system was also built in-house. The "ALife" system originally was to have NPCs constantly active in the game world, regardless of player interaction. By release, this had been reduced in functionality. Nonetheless, the capabilities are quite robust, as described in the Wikipedia entry:
"GSC Game World's proprietary ALife artificial intelligence engine. ALife supports more than one thousand characters inhabiting the Zone. These characters are non-scripted, meaning that AI life can be developed even when not in contact with the player.
The NPCs have a full life cycle (task accomplishment, combat, rest, feeding and sleep) and the same applies to the many monsters living in the Zone (hunting, attacking stalkers and other monsters, resting, eating, sleeping). These monsters migrate in large groups. The non-scripted nature of the characters means that there are an unlimited number of random quests. For instance, rescuing stalkers from danger, destroying stalker renegades, protecting or attacking stalker camps or searching for treasure. The AI characters travel around the entire zone as they see fit.
Numerous tactics can be employed to complete the game, such as rushing or using stealth and sniping. The NPCs will react in a different way to each of them. S.T.A.L.K.E.R.'s NPCs plan ahead by "Goal-Oriented Action Planning" to achieve this."
S.T.A.L.K.E.R. uses a modified version of the ODE physics engine to provide rag doll physics and accurate bullet ballistics: Bullets are affected by gravity and ricochet off of surfaces.
Adding to the realism is a completely dynamic day / night progression of time, including weather effects such as rain, lightning, showers and sunlight.
Even at an age nearing four years, it is still a most worthwhile game, both for FPS fans and RPG players. That it is now available on Valve's excellent Steam system for under $20.00 makes this a no-brainer if you have not already played it. Since the Steam version includes the latest patch, most of the niggling bugs in the initial release have been remedied.
The follow-up games from the same developer, S.T.A.L.K.E.R.: Clear Sky released in September 2008 for North America, and S.T.A.L.K.E.R.: Call of Pripyat released in February 2010 fail to capture the magic of the original in my opinion. The former was an unmitigated disaster, bug ridden and having none of the flavor that made the original so engaging. The latter returned to more of the game play mechanics of its progenitor, and is arguably the least bug plagued of the three. Recommended for players that loved the original, but for others only if it can be had at a bargain price.
All three games in the series have an active "modding" community, providing new maps, characters, game and character abilities, and modifications of difficulty levels. This adds considerably to the game play value and longevity of the game.
Like many things in life, this is a case where the original is the best.
Highly recommended, grade A material.
Taking a Bite from the Poison Apple: Gaming on a Mac.
I like the marketing of the company: it is consistently slick, polished, and clever.

"What do you think of our super cool NeXT hardware?" he asked me. I responded with something like "It's an already outdated steaming turd that is only falling further behind Intel based stuff each day. Dump it. Dump any of the business that's hardware. Put NeXTSTEP on Intel, it's so much better than Windows, you'll rule the world." You'd have thought I'd told the Pope that God was a phony.
By the time NeXT got their act in gear and dropped the hardware facade and made NeXTSTEP available for commodity hardware, it was too late: Windows had evolved and had grown in market share to an insurmountable lead. What a disaster, a ship steered by an ego blinded captain. Not his first flop, nor will it be his last in my opinion. But make no mistake: Jobs could sell $50.00 bags of ice cubes to Eskimos, and have them believing his version tastes better. You've got to admire that.
I like that since hardly anyone uses the Mac product (under 4% of the market), hackers don't waste their time on attacking the machine. I don't like that Apple markets this under the guise of their machine being more secure than Windows machines: security through obscurity is useless, and the fact that Apple hardware is consistently the first to fall at white hat security hacking contests demonstrates this. Nonetheless, in the same way that no city burglar is going to go out of his way and drive a hundred miles into the countryside just to rob you, the platform is much less likely to find itself under attack. For now at least.
I like that the very narrow range of hardware options used (and controlled) by Apple makes life easier for their OS developers. Stuff just works. That Windows works so well with the innumerable combinations of hardware it can be installed on is miraculous, Apple chose to simplify the problem and has done a superb job of it.
I like the support infrastructure of Apple. This is one area where I've yet to see anything even close in the traditional PC world. The Apple store staff knows their stuff. The Genius Bar really knows their stuff. A user of the product can get face-to-face, quality technical support for zero or minimal cost, instead of spending hours talking or chatting on-line with the typical cookie-cutter outsourced tech support staff.
All of this boils down to this: The Mac is the machine least likely to be bought by me, and the most likely to be recommended by me. Except to serious gamers. Allow me to explain.
When friends come to me seeking advice for a PC (I'll use the term generically to mean both traditional hardware and that of the Apple persuasion), I ask them some pretty simple questions.
If the answers indicate that they do not have a need for the latest and greatest hardware, add-in cards, etc. I usually point them to Apple. The machines are easy to learn and use for the novice. They tend to "just work" due to Apple's vice-like grip on the narrow range of hardware components allowed in the machine. The support infrastructure Apple provides means if I'm not around to answer a question, a quick phone call to Apple or a visit to the local store will usually result in rapid resolution. Keeps them off my back.
But for the serious gamer? Well, as Apollo 13 astronaut Jack Swigert radioed, "I believe we've had a problem here."
There are a few things standing in the way of the gamer who wants state-of-the-art hardware for maximum performance with modern games on a Mac.
Firstly, excepting the deskside Mac Pro models, there is no real way to upgrade the anemic graphics hardware in Apple machines. Some of the higher-end MacBook models are capable of running games with acceptable frame rates, but the really sophisticated bleeding-edge titles are off-limits if acceptable performance is expected.
Even with the Mac Pro, graphics hardware options are severely limited if the user wants to retain the full range of Apple specific functionality and sensibilities in the interface from the time of powering up the machine: the cards used by Apple require proprietary firmware (for EFI and the OS), meaning non-certified cards will not function properly in OSX, nor will proper boot screen functionality be retained.
This means the user is limited to Apple specific cards if they wish to retain these capabilities, and these cards tend to lag severely behind the current state of the art in performance. By way of example, the fastest card at the time of writing on the Apple Mac Pro web page is the ATI Radeon HD 4870, a card released two years ago. While there are some third-party cards of higher specification available, these too are at least a generation behind the state of the art. And of course, either solution carries the burden of the "Apple tax": you will pay more for the same card compared to the PC version.
It is possible to do what is effectively "brain surgery" on more modern cards via firmware manipulation to enable their use in a Mac Pro, but the likelihood of reduced functionality and performance, or of producing an expensive paperweight by such antics, far outweighs the benefits. See the entries at everymac.com and the Expansion Cards and Hardware Compatibility sections of the Wikipedia entry for the Mac Pro for a glimpse into the grief faced by users that need more GPU horsepower in the Mac environment.
Yet even then, the Mac user is boxed in: the latest high performance GPUs are quite power hungry. A single one may tax the power supply of the Mac Pro, and dual cards (SLI or CrossFire) would be out of the question without cobbling together some sort of Frankenstein power supply to supplement or supplant the one that comes in the machine.
Secondly, the Mac gamer faces a reality that mirrors the hackers' disinterest in the Mac: by and large, game developers don't give a hoot about the Apple environment. The Apple store lists 70 titles. Total. A tiny handful of FPS games.
This means that the Mac owner, if they want to play most any current game, will need to do so using Microsoft Windows. No need to rush out and buy a PC to supplant your Mac because it can't do something you want, however. There are a few ways a Mac owner can play Windows games utilizing their Mac hardware. We'll outline these here.
For simple games (2D, scrollers, etc.) with lightweight graphics performance requirements, a virtual machine environment such as Parallels Desktop or VMware Fusion will allow the user to install Microsoft Windows and the games they desire into a virtual machine. This effectively creates a "computer in the computer", and for simple games will allow the user to play without leaving the OSX environment. My own experiments show reasonable performance on a higher-end Mac Pro, so long as the game's graphical requirements are kept modest. For games with more rigorous requirements, performance in a virtual environment severely lags behind that of running on native hardware.
For these kinds of games, the user will need to install Windows in a fashion that allows for native booting. This can be accomplished with Apple's own Boot Camp or in a more flexible but more involved implementation using rEFIt.
Boot Camp provides a trivially simple mechanism for the gamer to get up and running on Windows games on their Mac hardware. The "Boot Camp Assistant" of the installer will walk the user through automatic repartitioning of the disk and installation of Windows. The current OSX install discs contain the hardware drivers for Windows components of the user's Mac, simplifying the installation and configuration of Windows considerably: no need to ferret these out from the web. The details and guides for using Boot Camp can be found at the Apple support page for the product.
rEFIt is an open source EFI boot loader and toolkit for the Mac. Unlike Boot Camp, which is limited to one alternate installation of Windows alongside OSX, rEFIt gives the user the functionality of a traditional boot manager, allowing the installation of multiple operating systems. On power up, the user is presented with a boot menu showing which operating systems are available for selection.
For either the Boot Camp or rEFIt solutions, I would recommend the gamer determine the precise hardware configuration of their Mac and acquire the latest Windows hardware drivers from the appropriate sources before starting the OS installation process. Often only the most current drivers will provide the desired gaming experience for the newest games (graphics card drivers being particularly fickle). At the very least, ensure that you have the needed network (NIC) driver, so that once Windows is installed and booted, you can retrieve the other needed drivers from the Internet.
You'll also want to get your hands on a decent keyboard and mouse. While the Apple realizations exude a Bang & Olufsen elegance, they're utterly useless for any kind of real gaming.
See you on the battlefields!
I Know What You Did Last Summer: Omniscient Debugging.
One of the more interesting and very useful features added (only in the Ultimate edition, unfortunately) is IntelliTrace historical debugging. The ability to debug an application while in effect manipulating time (going backwards in the execution history, for example) is known as Omniscient Debugging. The term is properly used when referring to the eponymous open source debugger by Bill Lewis, a computer scientist at Lambda Computer Science and Tufts University, but has become part of the vernacular and a synecdoche for such debuggers.
For readers that are not programmers, the "debugging" of programs is part of the life of any developer. Any sufficiently complex application is likely to have a few "bugs", cases where the expected behavior of the application is not observed to happen. This can be caused by any number of things, from incorrect specification of the design, mistakes in translating the design into program code, use cases that were not considered, bugs in the very tools used to produce the application, etc.
By way of example, let us consider the following tiny application (written in a form of pseudocode):
(define add_two_numbers using Number1, Number2 as (return Number1 * Number2))
The developer hands this off to the users, who start using the new routine add_two_numbers:
add_two_numbers 0,0 → 0
add_two_numbers 2,2 → 4
add_two_numbers 4,4 → 16
Oops! The first two test cases return the proper expected results, but the third returns 16 instead of the expected 8. In our admittedly trivial example, we can see from visual inspection that the programmer mistakenly put the multiplication operator "*" where the addition operator "+" should have been.
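For the curious, here is the same toy example as runnable Python, with the bug reproduced deliberately. Note how the first two test cases happen to mask the mistake:

```python
def add_two_numbers(number1, number2):
    # BUG: the developer typed "*" where "+" was intended.
    return number1 * number2

print(add_two_numbers(0, 0))  # 0  -- matches the expected sum by luck
print(add_two_numbers(2, 2))  # 4  -- also matches, since 2*2 == 2+2
print(add_two_numbers(4, 4))  # 16 -- expected 8; the bug finally surfaces
```

This is why test inputs like 0 and 2 make treacherous test cases: they are fixed points where addition and multiplication coincide.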
In real applications, mistakes may be buried in hundreds of thousands of lines of program code, and visual inspection is seldom practical. This is where the traditional debugger comes into play for the developer.
A debugger allows the programmer to observe the execution of the application under consideration, by showing exactly where in the execution stream the application is, and allowing the viewing and changing of variables and storage for the application, setting "breakpoints" where the programmer wishes to temporarily halt the execution, stepping through the execution one step at a time, stopping when certain conditions are met, and in some cases even allowing the changing of the program's state.
At that point, the developer could step through the execution of the routine, observing the flow through the source code step-by-step. The source of the problem would of course become apparent, and could be quickly remedied.
In the case of severe bugs, an application may "crash". In some cases, a "crash dump" will be produced by the execution environment. This contains information about the state of the application at the time of the crash. The developer can use this information with the debugger to analyze the state of the application when it crashed, observing the contents of variables, the place in the source code where execution was at the moment of the crash, and basic information about the path by which the application came to that place.
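As a rough illustration (a Python sketch with names of my own invention, not any particular platform's dump format), a crash dump can be thought of as capturing the exception and the call path at the moment of failure:

```python
import traceback

def divide(a, b):
    return a / b  # crashes when b == 0

def run_with_crash_report(fn, *args):
    """Run fn; on a crash, return a minimal 'dump': the full
    traceback text, which names the failing function and line."""
    try:
        return "ok", fn(*args)
    except Exception:
        return "crashed", traceback.format_exc()

status, report = run_with_crash_report(divide, 1, 0)
print(status)                         # crashed
print("ZeroDivisionError" in report)  # True -- the dump names the cause
```

A real crash dump contains far more (registers, stack memory, loaded modules), but the principle is the same: a snapshot of the moment of death, with only limited clues about how the program got there.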
A core problem with this model of debugging, however, is that detailed information about the execution path and program states leading up to the bug or crash is generally not available, even in a detailed crash dump file. This forces the developer to deduce where the problem might be and then test that hypothesis with the debugger by setting conditions and breakpoints while analyzing the application flow. This will usually lead to a more refined hypothesis, leading to further debugging, leading to a more refined hypothesis, nearly ad infinitum (or at least it can feel that way) until the problem is found. In cases where the bug is sporadic, the programmer may have difficulty even reproducing the problem.
This can be a very time consuming and taxing practice for the programmer involved!
Ideally, the programmer would be able to observe the state of execution for the application at any point in time of the execution lifetime.
That is precisely the capability that omniscient debugging brings to the tool set of the programmer. The concept of such historical debugging has been around for decades, but implementations proved impractical for most of this time. The first truly practical and usable example is surely the Omniscient Debugger (ODB) of Bill Lewis. A very detailed paper by Bill can be seen in Debugging Backwards in Time, and a most interesting video lecture of the same name is hosted by Google. You can view these to garner details on how this functionality is achieved, with examples of the kinds of bugs that are vastly simpler to find by taking advantage of this kind of technology.
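To give a flavor of the idea, here is a toy sketch in Python using sys.settrace (an illustration only; this is not how ODB or IntelliTrace are actually implemented). It records every executed line of a function along with its local variables, so any past state can be inspected after the fact:

```python
import sys

# Recorded (line number, local variables) snapshots.
history = []

def tracer(frame, event, arg):
    # Record each line executed inside buggy_sum, with its locals.
    if event == "line" and frame.f_code.co_name == "buggy_sum":
        history.append((frame.f_lineno, dict(frame.f_locals)))
    return tracer

def buggy_sum(values):
    total = 0
    for v in values:
        total = total * 10 + v  # BUG: meant total + v
    return total

sys.settrace(tracer)
result = buggy_sum([1, 2, 3])
sys.settrace(None)

# "Travel back in time": every intermediate state is available, so we
# can see exactly when 'total' went wrong without re-running the program.
for lineno, local_vars in history:
    print(lineno, local_vars)
```

Running this shows 'total' stepping through 1, 12, 123 instead of 1, 3, 6, pinpointing the bad line from the recorded history alone. The real systems, of course, solve the hard problems this sketch ignores: recording overhead, storage of enormous histories, and efficient indexed queries over them.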
With Visual Studio 2010, Microsoft has incorporated and extended this capability and technology, calling it IntelliTrace. Supporting the historical debugging of managed C# and Visual Basic, with experimental support for F#, the product promises to make the work of developers on the Microsoft platform vastly more productive. Details of the new capabilities can be found in the MSDN articles Debugging With IntelliTrace and Debugging Applications with IntelliTrace.
Open source developers can avail themselves of historical debugging with the aforementioned ODB, TOD and GDB debuggers.
I know what you're thinking. More importantly, I know what you were thinking...
Monday, May 10, 2010
Always Be Reading A Book That Will Make You Look Good If You Die In The Middle Of It.
"Outside of a dog, a book is man's best friend. Inside of a dog it's too dark to read. " - Groucho Marx
A good bookshelf is a must for any serious devotee of hardware and software. Myself, I stopped counting when I crested the 1,000-book mark for my computer science related texts. As learning tools, reference works, and sources of pleasure, a good computer book is hard to beat in my opinion. With that in mind, I thought I'd make an entry listing the top tenish books that I think every serious programmer should have on their shelves.
These are not listed in any particular order other than that which they came to mind, and are operating system and language agnostic for the most part.
The Art of Computer Programming; Donald Knuth
All three volumes of course, plus the five fascicles that presage volume 4. Never has so much CS knowledge been packed into such a small space. This truly is a CS degree in a box.
Elements of Programming; Alexander Stepanov
Programming is mathematics. Seldom is this fact given proper treatment. This book remedies that brilliantly.
Introduction to Algorithms; Cormen, Leiserson, Rivest, Stein
A standard reference for professionals, and a widely used university text. Half of a CS degree is in this book.
Modern Operating Systems; Andrew Tanenbaum
The classic all-around reference and learning resource for the details of the inner workings of operating systems. From the creator of Minix, a Unix-like teaching OS that inspired the first versions of Linux.
Beautiful Code; Andy Oram, Greg Wilson
See how luminaries in the programming world have created elegant solutions to difficult problems in a variety of programming paradigms and languages.
Masterminds of Programming: Conversations with the Creators of Major Programming Languages; Federico Biancuzzi, Chromatic
A superb 'behind the scenes' look at how historic and influential languages came to be, told by their progenitors.
Code Complete: A Practical Handbook of Software Construction; Steve McConnell
A classic, recently updated. Certainly one of my 'desert island' picks for instruction on the proper design and construction of software.
Programming Windows; Charles Petzold
It's dated. It's also still the classic reference for Windows programming. "Look it up in Petzold!"
Windows Internals; Mark Russinovich, David Solomon, Alex Ionescu
There is no better publicly available reference for what makes Windows tick than this tome from the 'ghostbusters' of the Windows world.
The Algorithm Design Manual; Steven Skiena
Implementations and explanations of all of the important algorithms, and humorous to boot.
Structure and Interpretation of Computer Programs; Harold Abelson, Gerald Jay Sussman
This classic, often called 'the wizard book', was the text for MIT's introduction to programming classes. A rigorous lesson in how to think about programming.
Compilers: Principles, Techniques, & Tools; Alfred V. Aho, Ravi Sethi, and Jeffrey D. Ullman
The best survey, in my opinion, of the inner workings of compilers, with a sharp view of how design decisions affect the user.
I think if I awoke to my house on fire, after getting my family out, these would be the books I'd want to save from the blaze. I could happily settle for them being the only books on my personal shelf.
What's on your own list that you'd bring to a desert island?
Saturday, May 1, 2010
Maximum PC: Maximum Content.
There used to be a myriad of well-put-together, PC-centric magazines. Sadly, it seems the market for these is not sufficient to support the print publishing side for most of them. Only the strong have survived. Of the few remaining, most have devolved into Reader's Digest-style shadows of their former selves, catering to a pretty dumbed-down demographic.
If you're a developer, Dr. Dobb's still has a constant stream of fairly deep and interesting articles that any coder can enjoy. For those that aren't coders, where Dr. Dobb's might be too deep, that still want a slightly more technical bent from a hardware and software standpoint, CPU fits this need well, along with decently in-depth hardware and software reviews.
If you're on the Linux side of things, your choices widen. Linux Journal provides a Dr. Dobb's feel in a Linux centric rag. Linux Magazine and Linux Format provide the novice with a gentler format (but be prepared to check your bank account before subscribing: the cost of these imported periodicals is wallet numbing!)
My desert island choice is MAXIMUMPC. The technical staff there is almost always on the ball, seldom making any glaring errors. The reviews are quite in-depth; you'll have to find a specialist web site to exceed their detail (has anyone figured out the meanings of the occasional random 'factoids' placed in the specifications of reviews? Enquiring minds want to know!)
Fortunately for my desert island pick, it has started covering Linux in the recent past, so I'll not miss out too much on happenings in the alternate OS universe.
The writers have a sense of humor sharp enough that it has on occasion generated controversy and feedback from the more censorious readers. You've got to love it! Always interesting, sometimes controversial, I've been an avid reader since the days when it was called Boot.
Add to this that the publisher has seen fit to make available PDFs of back issues dating all the way back to 2003 at MAXIMUMPC PDF Archives, and you've got to want to support these guys. Their No BS Podcast series is one of the few I'm interested in listening to. Finally, the magazine supports a very active forum community at MAXIMUMPC Forums.
It is the one PC magazine I wait for like a cat hearing a can opener.
If anyone knows of other worthwhile PC centric magazines, I'm all ears! I miss the days of having a panoply of choices.
Wednesday, April 28, 2010
Swimming in the Septic Tank with my Gaming Buddies.
C'mon in, the water's warm! And there are flotation devices all over the place!
Seriously, you wouldn't go swimming at your local water treatment facility, so why on earth would a serious gamer try to run their games in an environment that is even more unhygienic?
I helped a few posters recently with a puzzling 'lag' issue. They would enter the game, but each spawn was met with a delay of many seconds, and in some cases the same puzzling delay would happen when accessing the game menus.
Turned out to be their anti-virus software getting in the way, and either disabling it, or adding all the game directories to the exceptions list, fixed the issue. I see this all the time, where some software turd causes a problem with a game, slowing things down or otherwise interfering with the game. Anti-virus, anti-malware, peer-protection, printer drivers, iTunes, Quicktime, peripheral drivers, etc., the list of cesspool floaters is endless.
Which got me thinking about something I ponder about periodically: Why on earth would a serious gamer have anything but the leanest, meanest OS environment for playing their games? What on earth is the reason to have an anti-virus running with a game that is from a trusted source (if you're stealing games, that's another story)?
I myself have always had separate Windows installations, one hardened for day-to-day activity and any uncontrolled network access, the others having only the drivers and software needed to play my games. Gives me the protection I want for regular activities and maximum performance, minimum hassle for gaming.
Many use things like Hardware Profiles (unfortunately deprecated in Vista and beyond) to 'minimize' unnecessary system load, but that can be cumbersome. Others use snake oil programs that purport to improve game performance by shutting down system processes and optimizing memory. I'll not rehash the myth that messing with MS system processes improves game performance (it doesn't). This is just another messy and questionable 'tweak'.
It is trivially simple to start with a clean Windows install and clone it to a separate partition to provide a multi-boot environment where one copy is hardened, the other is for gaming. The web is full of helpful tutorials that can guide even the most novice of users through the process. Users with the Enterprise and Ultimate versions of Windows 7, which allow native booting from VHD, can get equivalent functionality without the need to partition the hard disk at all.
That is the environment I use: Multiple VHD, each with exactly the environment needed for the games they contain, tuned and optimized (some games 'prefer' certain drivers, etc.), and only running things absolutely needed for those games. No A/V, no firewall, no junkware. Nothing that could affect the performance or otherwise interfere with my games. Another VHD contains a fully hardened Windows installation, with a combination of anti-malware/firewall/security that ensures safe passage when navigating the sewer we call the Internet.
A side benefit is that by using Differencing VHDs, all of this is done with very minimal space requirements: I don't have to duplicate the space required for each Windows installation, saving hundreds of gigabytes of storage. With Windows 7 Ultimate/Enterprise, I can boot 'natively' (that is, to the bare metal: a normal boot running at the full speed and capabilities of the hardware) or boot using a virtual machine to any of these VHD installations.
I can even 'boot' into the hardened environment using a virtual machine while already booted into one of my gaming environments should I need to download or otherwise access the web, without needing to reboot the machine. I can grab something from the Internet, have it completely scanned in the virtual environment, then drag-and-drop it into my gaming environment.
Maximal security combined with maximal speed. Another benefit to the use of a virtual machine in this case is the ability to snapshot the virtual hardened Windows, so if I do get exposed to some nasties, I can rollback time with the click of the mouse. Cool! Should some really extreme corner case of attack or malware successfully corrupt or infect one of my 'game' environments, it stays isolated to that environment, and I can restore it from backup ludicrously quickly (another Differencing VHD benefit).
I go ahead and run everything in the 'game' environments under a 'real' administrative account, since even in 2010, there are developers that seem to still be incapable of writing a properly behaving userspace/usermode game (see Pings? We Don't Need No Stinking Pings! for an example.)
If you're interested in trying out this kind of setup, there is one caveat: You must be disciplined. No network activity in the naked 'game' environments other than the game and game-related functions. No installing or running anything not from trusted sources. Use the hardened environment, either natively or in a virtual machine, for everything else. Otherwise, you're swimming in the septic tank. Surrounded by hepatitis-infected needles. Naked.
Do note that if you choose to run 'naked' like this, you'll still want some kind of isolation from the baddies on the WAN. If directly connected to the WAN (i.e., your PC's IP is public), running the Windows firewall should be part of the setup - even with hygienic use of the naked setup, this will minimize external attacks. Even better (since the Windows firewall can be problematic with some games), keep it off and use a proper router with NAT (and its own firewall enabled if you wish). NAT done correctly will keep the outside world at bay, and can be easily configured if needed (seldom) for specific games. Router firewalls can be more robust than the built-in Windows system, and reduce load on the CPU for firewall tasks. The overhead of going through a router, firewall enabled or not, is negligible, and the protection provided warrants their use, in my opinion, even if your PC is the only device on your LAN.
The benefits of this kind of setup seem so overwhelming to me (absolute security combined with the leanest, meanest yet perfectly 'tuned for games' environment, with a significant reduction in the inevitable conflicts between different installed software that can be arduous to troubleshoot), I can't imagine why every serious gamer wouldn't want the same. No firewall, no A/V, no anything to get in the way of the game. It's better than running on a nude beach while sipping a Cialis spiked cocktail! Minimal encumbrance, maximum performance! Try it, you'll never look back.
I ♥ Windows 7 Event Logging!
Two of the really cool features are the ability to build sophisticated custom filters and views, allowing you to focus on exactly what you're looking for, and the ability to create events on events.
For example, the following XPath query for a filter/view allows me to trap a specific event for a specific PID, excluding all other 'noise' in the event log:
<QueryList>
  <Query Id="0" Path="Microsoft-Windows-Winsock-AFD/Operational">
    <Select Path="Microsoft-Windows-Winsock-AFD/Operational">
      *[System[Execution[@ProcessID="3141"]]]
      and *[System[EventID="1000"]]
    </Select>
  </Query>
</QueryList>
Equally cool, I can assign tasks to any event/view/filtered view such that the system will notify me with a dialog, E-Mail me, or start any arbitrary program.
Hugely useful, allowing efficient, quick & dirty debugging without even needing to fire up a real debugger, and as an incredibly useful ancillary to formal debugging.
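For readers who'd rather not hand-edit XML, here's a small sketch (Python, purely illustrative - Event Viewer doesn't care how the XML was produced) that generates an equivalent filter programmatically, using the consolidated form with both predicates on the System element. The function name and the example channel/PID/EventID values are mine, not anything standard:

```python
import xml.etree.ElementTree as ET

def build_event_filter(channel, pid, event_id):
    """Build an Event Viewer XML filter selecting one EventID for one PID."""
    query_list = ET.Element("QueryList")
    query = ET.SubElement(query_list, "Query", Id="0", Path=channel)
    select = ET.SubElement(query, "Select", Path=channel)
    # Consolidated XPath: both predicates applied to the System element.
    select.text = (f'*[System[Execution[@ProcessID="{pid}"] '
                   f'and EventID={event_id}]]')
    return ET.tostring(query_list, encoding="unicode")

# The channel/PID/EventID here mirror the hand-written example above.
print(build_event_filter(
    "Microsoft-Windows-Winsock-AFD/Operational", 3141, 1000))
```

The resulting XML can be pasted into the Filter Current Log dialog's XML tab, or saved as a Custom View.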
Sunday, April 18, 2010
One of the more interesting game problem cases I've worked on.
All was well until a couple of weeks ago, then load times for the maps went to hell, taking five to eight minutes, with the screen alternating between white and black. Once loaded, selection of menu items or login resulted in another minute or so of stalling.
The poster was using one of the highest-end and newest GPUs that had a known minor issue with load times for the game (but usually only resulting in 30 second or so loads for others, with no stalls once loaded) that had been remedied in a recent GPU driver release.
The poster, and an apparently savvy friend, were pulling their hair out, having tried every option reasonable: reinstall game, windows, no anti-virus, different drivers, disable sound, etc. but still no joy.
This was one of the weird ones. Like the PC of an old friend that would randomly play "It's a small world" out of the internal speaker (that took me a bit to figure out...)
I've been doing this a long time, as many of you probably have, giving us an advantage: We've probably seen it before, and had to figure it out.
The problem: USB bus conflict/flooding. I first saw this a few years back on a fellow gamer's machine, after trying all the obvious things to resolve their strange performance issues, I pulled out my USB bus analyzer, and bingo!
For this poster, simply unplugging all USB devices, then plugging them back in, completely solved the issue. The same (actually a more thorough reset) can be done by uninstalling the sub-branches of the USB branch in Device Manager and rebooting the machine, allowing it to reinstall and enumerate the devices.
This very same problem manifested itself for a couple of players I worked with where their game ran in slow motion, as if time were slowed down. Not laggy game play, in fact no jerkiness of any sort, just Six Million Dollar Man slow-motion effects for everything in the game!
A cool effect of this was they could FRAPS record game play, and when played back, it was normal speed. Spooky! When I had them do this, I'm sure they must have thought I was pulling some kind of practical joke on them.
The same fix (resetting USB by unplugging their USB devices) remedied the problem.
These strange and obscure ones must make life interesting for tech support.
Wednesday, April 14, 2010
Yes, Virginia, 32-Bit Windows can use more than 4GB of physical RAM...
Let's set the record straight.
YES! 32-bit versions of Windows can take advantage of more than 4GB of physical RAM.
Example 1: The server editions.
I refer the reader to Memory Limits for Windows Releases for the canonical document.
On these systems, the application is of course limited to 2GB of userspace virtual address space. Using the (again, oft misunderstood) /3GB switch, applications that are compiled with the appropriate flag can avail themselves of a 3GB virtual address space. Utilizing processor and OS PAE capabilities, and Windows APIs such as AWE, or via Kernel mode drivers, such applications can map whatever physical RAM address space the machine exposes into their particular virtual address space. This is how things like SQL Server can have huge memory use (well over any 4GB limit) on such versions of the OS.
Far too many Internet 'experts' advise readers to use things like the /3GB switch to 'increase how much memory your program can use', completely missing the fact that the application must be compiled with the appropriate flag to do this. This can be done 'after the fact' with EDITBIN or a hex editor, but unless you're sure the coders of the application were sane, or you've otherwise analysed the binary, you might be asking for trouble. In addition, programs compiled with '/GL' (Whole Program Optimization) cannot be modified using accepted post-compile methods. The end result is the reader ends up burning 1GB of address space that the OS could use for the kernel, address space that ends up just sitting there wasted. I would direct the interested reader to the aforementioned Raymond's blog for an amusing, multi-year exposé of this.
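The 'appropriate flag' is the IMAGE_FILE_LARGE_ADDRESS_AWARE bit (0x0020) in the PE file's COFF Characteristics field - the same bit EDITBIN's /LARGEADDRESSAWARE toggles. Here's a minimal Python sketch (the function name is mine) that inspects a binary for it; note this only reads the header, it says nothing about whether the code actually behaves with large addresses:

```python
import struct

IMAGE_FILE_LARGE_ADDRESS_AWARE = 0x0020

def is_large_address_aware(data: bytes) -> bool:
    """Check the LARGEADDRESSAWARE bit in a PE image's COFF Characteristics."""
    if data[:2] != b"MZ":
        raise ValueError("not an MZ executable")
    # e_lfanew: offset of the PE signature, stored at 0x3C in the DOS header
    (pe_off,) = struct.unpack_from("<I", data, 0x3C)
    if data[pe_off:pe_off + 4] != b"PE\x00\x00":
        raise ValueError("missing PE signature")
    # Characteristics is the last WORD of the 20-byte COFF file header,
    # which begins immediately after the 4-byte "PE\0\0" signature.
    (characteristics,) = struct.unpack_from("<H", data, pe_off + 4 + 18)
    return bool(characteristics & IMAGE_FILE_LARGE_ADDRESS_AWARE)
```

Usage would be something like `is_large_address_aware(open("game.exe", "rb").read())`.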
Example 2: The 'client' editions.
Things get a little more confusing for the 'client' versions of Windows (the consumer versions.) For these, Microsoft decided after rigorous testing that drivers and other low-level code faced with RAM addresses beyond 4GB would cause too many problems, and so disabled the OS's 'awareness' of RAM beyond 4GB for these versions.
However, properly coded applications and drivers (e.g., Ram Disk software by SuperSpeed) can in fact utilize RAM beyond 4GB even in the 'client' 32-bit Windows versions. I used to do this every day, until the remaining reasons not to use a 64-bit OS were eliminated for me.
Here's a screenshot of this very application, running on a friend's PC with 6GB of physical RAM, providing a 5GB RAM Disk. Note that the 2GB of physical RAM above 4GB is being used perfectly well by this application running under 32-bit Windows (click to see large original):

There were valid reasons for using such 'work-arounds' for 32-bit systems to use more than 4GB of RAM in the past: driver vendors had not kept up with 64-bit systems, and often had either no drivers, or problematic ones. Those days seem long gone, and at this point, there's really no good reason not to use a fully 64-bit capable operating system, even on consumer PCs.
Nonetheless, the fact remains: Yes, Virginia, 32-bit Windows can use more than 4GB of RAM.
Update:
Finally! Someone at Microsoft read this, perhaps? As of early 2011, the Microsoft Developer Network's canonical document "Memory Limits for Windows Releases" has been properly updated with the correct facts:
X86 client versions with PAE enabled do have a usable 37-bit (128 GB) physical address space. The limit that these versions impose is the highest permitted physical RAM address, not the size of the IO space. That means PAE-aware drivers can actually use physical space above 4 GB if they want. For example, drivers could map the “lost” memory regions located above 4 GB and expose this memory as a RAM disk.
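The arithmetic behind those numbers is simple enough to check for yourself; a quick sketch, with the 6GB/4GB figures echoing the RAM disk example above:

```python
GB = 2 ** 30
physical_ram = 6 * GB     # the machine from the RAM disk screenshot example
client_cap   = 4 * GB     # highest RAM address 32-bit client Windows will use
pae_space    = 2 ** 37    # 37-bit physical address space with PAE enabled

lost = physical_ram - client_cap
print(lost // GB)         # 2 -> GB of 'lost' RAM a PAE-aware driver can reclaim
print(pae_space // GB)    # 128 -> GB of addressable physical space at 37 bits
```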
About time...
Sunday, March 21, 2010
Hyperthreading, Win 7 Scheduling and Core Parking, A Simple Tale
So, here's my go at it.
Think of it this way. You have a bank, with four tellers (cores). Many customers (threads) waiting to get serviced (CPU time on a core). These tellers are pretty bright, they can each multi-task (hyperthreading) between two customers (threads on each logical core). They can each have two customers (threads) at their window, but can only work with one customer at a time (there is after all only one physical core per two logical cores of the CPU), but if that customer has to fill out a form (the thread stalls, or sleeps, or has used its fair share of CPU time), the teller can work on the next customer (thread) in their line (the two hyperthreads per core). They simply make a note of what they are doing with the first customer (the thread CPU states), put it aside (each core in the CPU has two areas per core to track state), and start working on the second customer (thread). The tellers (cores) can work very efficiently with these two customers (threads), and can do so for as long as the bank manager (Operating System) lets them, because they basically have everything they need to know about those two customers (threads) right at their fingertips (the two areas of each core that hold this information.)
Above all of this is the bank manager (the OS). The manager decides which two customers (threads) are at each teller (core) at any given time, and can swap one of those customers (threads) with another waiting customer (thread) in the bigger line of waiting customers (the whole thread pool for the OS).
Now it so happens that this swapping slows the servicing of the customers (threads), so the manager (OS) avoids this at all costs. In addition, the manager (OS) knows that the more work (threads) he can keep on as few tellers (cores), the better. In fact, if the manager (OS) can, he'll put a teller (core, real or logical) on break (CPU parking), not having to pay them during this time (energy savings for the CPU.) Even more interesting, the manager (OS, actually OS and CPU features) knows that if he can push as much work as possible onto a few tellers (cores), those tellers drink a big cup of coffee and work even faster than normal (turbo-boost). The manager (OS/CPU features) knows that there's not enough coffee to go around to all the tellers (cores), and that all of the tellers (cores) can't be working at the faster rate, so the manager (OS/CPU features) tries to keep as few tellers (cores) active as possible, so long as he thinks it won't affect the overall servicing of the customers (thread pool).
Playing with things like the manager's decisions (thread scheduling) by overriding him (playing with affinity) can force all of the tellers (cores) to do work, even when the manager (OS/CPU features) would dictate this is not the most efficient way to do things. It will likely have no effect on the rate of servicing of the whole customer collection (thread pool), and may in fact cause all of the tellers (cores) to work at normal speed (no turbo-boost), slowing things down in reality.
The manager (OS) knows best, that's why you hired him. Barring him being drunk (a bug in the OS or CPU scheduling logic), he'll usually make better decisions than you.
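For the programmers in the audience, the packing idea can be caricatured in a few lines of Python. To be clear, this is a deliberately crude toy of my own invention, not how the real Windows scheduler or parking engine works; real scheduling weighs far more than slot counts:

```python
def assign_threads(num_threads, num_cores, slots_per_core=2):
    """Toy model of core parking: pack runnable threads (customers) onto as
    few physical cores (tellers) as possible, two hyperthread slots per core;
    cores left with no work are 'parked' (on break)."""
    cores = [[] for _ in range(num_cores)]
    for t in range(num_threads):
        # fill each core's two slots before touching the next core
        cores[t // slots_per_core % num_cores].append(t)
    active = [i for i, c in enumerate(cores) if c]
    parked = [i for i, c in enumerate(cores) if not c]
    return cores, active, parked

# Three runnable threads on a four-core CPU: two cores stay parked,
# and the active ones are candidates for turbo-boost.
print(assign_threads(3, 4))  # ([[0, 1], [2], [], []], [0, 1], [2, 3])
```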
Thursday, March 18, 2010
Port Forwarding: Slaying the Mythical Dragon of Online PC Gaming.
Most gamers blindly follow the ubiquitous 'forward your ports!' advice, not even understanding what it means to 'forward a port', much less the ramifications of doing so. My intent is to set the record straight for the reader, so that they may better understand the how, what, when, why, and where of port forwarding. I have greatly simplified and generalized the terminology and examples, which may offend experts, but is appropriate for the intended audience.
To be clear, forwarding of ports is seldom if ever required to allow the client of the online PC game to function properly. Unnecessarily forwarding ports is not only undesirable, it may expose the gamer to security risks, and can interfere with proper functioning of their environment, including games.
The typical PC gamer has a pretty simple environment: Their PC, a router, a modem (perhaps a unit that combines the two functions of router and modem), and...and that's it. The router serves the function of shepherding traffic from the gamer's local area network (LAN) to the wide area network (WAN), where the online game servers 'live'. The modem provides the electronic means for the gamer to access the WAN infrastructure. In some cases, these two functions (router and modem) are combined into a single unit, variously called a router or modem, depending on who you're asking. Often, gamers have a router in their environment without knowing it - they've been told 'that's your modem'.
Why these pieces of hardware are used comes down to the subject of addresses. Each PC in a network must be assigned a unique address. The gamer is probably familiar with these. They're the number sets like '192.168.1.123' you might see for your PC on your LAN, or the '74.125.19.106' you might see if you ping http://www.google.com/. You've probably heard them called the 'IP address'. The important thing is that each PC must have a unique address. Much like your mail goes to a unique address, if different households could have the same address, you can imagine the mess that would ensue.
Now early in the days of the 'net', the groups defining various standards and protocols decided it would be wise to have addresses that were 'public', that is, known to the world as the address to send to, and 'private', that is, addresses that the 'outside' (WAN) world can't even see. This was done for many reasons including reducing the need for public addresses to be used, and to allow enterprises to split up a 'public' address into one or more internal 'private' addresses.
The router's primary function is to manage, control, and manipulate the barrier between the 'private' LAN and the 'public' WAN.
In a typical environment, the modem provides the connection to the WAN, giving the user on the 'inside' of the modem connection some public IP address on the WAN assigned by their ISP. The router takes the traffic from the PCs on the LAN and passes it on through the modem to the destination server on the WAN. We'll call this the 'request' to the server. The server does whatever it needs to process the request, and responds to the WAN address of the gamer. We'll call this the 'reply' from the server.
The router will keep track of requests sent out to the WAN, and in general only allow traffic from the WAN to a PC on the LAN if it determines that traffic is an appropriate reply from a server to a request from a PC on the LAN. Now, the router/modem usually has one, and only one, public WAN address assigned to it. What are we to do if we have several PCs on our LAN that all want to make requests to the same server on the WAN and get their respective replies? The router does this for us through a mechanism generically called Network Address Translation, or NAT for short. There are many details we won't delve into here; a good overview can be found at http://en.wikipedia.org/wiki/Network_address_translation, with some useful references. Readers that wish a more in-depth treatment might use the superb books by Comer at http://www.cs.purdue.edu/homes/dec/.
The problem NAT solves is analogous to sending mail between two apartment buildings. We know the street address where we want to send it (the IP address), and the apartment number. In the IP world, the apartment number is called the 'port'. For our PC game, the game client (what the gamer plays) needs to send requests to the game server(s), and it does so by sending requests to the IP address of the server, and including the port that address should go to on the server. The request needs to have a 'return address' so the server can reply, so the game will add the address of the game client, and the desired return port to the request.
Now as we've said, the client is on a private address. The server can't see this or do anything with it. So the router changes the address information, replacing the private LAN address with its public WAN address, and remembers the return address port for the request. If more than one PC on the LAN makes a request to a server and specifies the same return port, the router notices this and changes the return port along with the return IP address, keeping track of which PC corresponds to which return port it sends in the request to the server. When the server replies, it uses the return address of the client, which will be the public (WAN) IP address of the gamer, and the return port, which may have been changed from the actual return port by the router.
When the router sees this traffic, it peeks into the packet and determines which PC belongs to the requested return port. The router changes the return port to the one originally in the PC's request, if needed, changes the return IP address to that of the correct PC, and passes the traffic onto the LAN, where the PC that made the request will receive its reply from the server.
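For the curious, the bookkeeping described above can be sketched as a toy NAT table in Python. The class name, port range, and addresses are all made up for illustration; a real router also tracks protocols, timeouts, connection state, and much more:

```python
import itertools

class NatRouter:
    """Toy NAT: rewrite (private ip, port) to (public ip, translated port)
    on the way out, and reverse the mapping when the reply comes back."""
    def __init__(self, public_ip):
        self.public_ip = public_ip
        self.table = {}                      # translated port -> (lan_ip, lan_port)
        self.next_port = itertools.count(50000)

    def outbound(self, lan_ip, lan_port, server):
        """A LAN PC makes a request: remember who asked, rewrite the source."""
        wan_port = next(self.next_port)
        self.table[wan_port] = (lan_ip, lan_port)
        # the server sees only the router's public address and port
        return (self.public_ip, wan_port, server)

    def inbound(self, wan_port):
        """Reply arrives from the WAN: known translated ports route back to
        the right PC; anything else is unsolicited and dropped (None)."""
        return self.table.get(wan_port)

router = NatRouter("74.125.19.106")
print(router.outbound("192.168.1.2", 3074, ("game.example.com", 443)))
# ('74.125.19.106', 50000, ('game.example.com', 443))
print(router.inbound(50000))   # ('192.168.1.2', 3074) -> reply delivered
print(router.inbound(60000))   # None -> unsolicited, dropped
```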
In general, we don't want random traffic coming from the WAN into our LAN. Because the router peeks into traffic to determine if it even belongs on the LAN, random attempts to enter the LAN are thwarted. Unless the user specifically needs to have requests from the WAN enter the LAN (to a server of some sort on our LAN to reply to), this is precisely what we desire. Routers usually include some kind of 'firewall' capability, which considerably enhances the security of the client<->server interchanges and provides even more protection against unsolicited traffic from the WAN. We will not detail firewall functionality.
What if the gamer needs to have a server on the LAN that can be accessed by others on the WAN? How might we accomplish this? This is where the feature of the router called 'port forwarding' comes into play. The user can configure their router, and set it to allow traffic from the WAN to its WAN address into the LAN. The user does this by specifying what PC is going to reply to traffic on which port(s). For example, if we wanted to run our own web server on our LAN (or game server, just change the nomenclature and numbers), it would need to get requests on port 80, the default port number for HTTP (browser) traffic. If the PC running our web server on our LAN had an address of say 192.168.1.2, we would configure the router to forward any traffic from the WAN to its WAN address with a destination port of 80 to the PC at 192.168.1.2. When the web server (or game server) replies to the request, it is sent through the router back to the WAN address of the original requester. The same kinds of manipulations to the address happen via NAT as with the game client example, just in reverse. So forwarding is for clients on the WAN to get to a server on your LAN. Pretty simple, no?
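The forwarding rule itself amounts to one more lookup the router performs on unsolicited WAN traffic. A minimal sketch, with the rule and LAN address hypothetical but matching the web server example above:

```python
# Forwarding rules configured on the router: WAN destination port -> (LAN ip, port).
# Here, port 80 (HTTP) is forwarded to the PC running our web server.
forward_rules = {80: ("192.168.1.2", 80)}

def route_unsolicited(dest_port):
    """Decide what the router does with WAN traffic that matches no
    outstanding request: deliver it if a forwarding rule says so, else drop."""
    if dest_port in forward_rules:
        return ("deliver", forward_rules[dest_port])
    return ("drop", None)

print(route_unsolicited(80))    # ('deliver', ('192.168.1.2', 80))
print(route_unsolicited(3074))  # ('drop', None)
```

Note that without the rule, every unsolicited packet takes the 'drop' path, which is exactly the behavior a gamer playing a pure client wants.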
Now, to kill the dragon!
Modern PC games played online need the game client to make requests to the game server. The game server, and other game clients, do not make requests to the game client. There are exceptions to this, namely some peer-to-peer games, and cases where one of the clients is also running the game server on one of their PCs. Both fall into the generalized description of a server from earlier. But in general, modern games are client-server based, where the server is run by a provider on the WAN, and the gamer plays the client on the LAN. At no time do the servers try to make an 'inbound' request to the client. Hence, forwarding of any ports to play the game is completely unnecessary, and accomplishes nothing. Forwarding ports when not explicitly required poses a security risk to the user, and can in fact interfere with proper traffic flow for games.
The game's client makes the requests, the router handles the manipulation and shepherding of the traffic to the server on the WAN and the corresponding reply traffic from the server on the WAN to the game client on the LAN. Not the other way around!
Unfortunately, 'You need to forward your ports!' is one tough dragon to slay, and this myth is constantly perpetuated in forums, and occasionally even by misinformed game publisher support staff. There are even whole web sites devoted to the subject, with applications to automate this unnecessary and potentially security compromising router feature for the uninformed user.
Unless you are instructed that your game requires ports to be forwarded from an authoritative source (the game manual, the game developers, or in some cases the publisher with the caveat noted earlier), you are likely not required to do it. Abandon all hope ye that consider enthusiast forums to be an authoritative source!
To humanize it, think of it this way: You, in your household, act as the 'router' and 'firewall' in a way for traffic in and out of your house (your LAN). You, and others in the house are free to go out from the house to seek information (onto the WAN). When someone comes knocking at the door with the answer, you can peek through the peephole on your door and decide if you expected them, and let them into your house. If a stranger comes knocking, you're likely to decide they're uninvited, and not let them in. Port forwarding is giving a stranger the key to your door. In fact, it's giving the key to your door to everyone in the world that knows how to get to your door! The 'experienced gamers' and 'net experts' that tell you there's no danger in forwarding ports when it's not explicitly required are doing just that: telling you it's OK to give the key to your front door to everyone on the planet. I'd venture most intelligent readers would never subscribe to such nonsense.
How many readers in game enthusiast forums do you think blindly forward ports to 'fix' problems? How many of those same readers will download the latest coolest 'tweak tool' for the game when offered up on the forum? How hard do you think it would be to perhaps list some real ports for the game, and throw in an extra one that the later downloaded 'tweak tool' actually listens on, allowing a remote intruder into the victim's PC to run amok? If you don't know how to verify exactly why a game should need ports forwarded, exactly which ports should be forwarded, and know exactly how to do this, you probably shouldn't. Since port forwarding, with a properly configured and behaving modem/router, is not needed by any modern PC game client, you probably shouldn't anyway.
Before ending, and in all fairness, it should be noted that some misbehaving or otherwise buggy routers can be 'worked around' by forwarding ports where this would not normally be required. Part of this 'You need to forward your ports!' malarkey is undoubtedly from uninformed users seeing this 'fix' an issue, not understanding that the problem in fact is elsewhere and the 'fix' is a bandage that may cause other problems and security issues. Used properly, this can allow routers that restrict the user to NAT other than Type 1/Cone to mimic a properly behaving full-cone router for a game. This will of course be limited to only one PC on the LAN side, and will not allow multiple players to simultaneously play from the LAN if this work-around is needed. See Troubleshooting Multi-Player PC Game Connectivity Issues for examples of this.
I hope after reading this, the reader has a clarified understanding of what port forwarding is, and when its use is appropriate.
Wednesday, March 17, 2010
Troubleshooting Multi-Player PC Game Connectivity Issues
The suggestions are presented in a form generalized enough to prevent the document from becoming a book (and it's already plenty long), but not so much so that they become useless.
Using these suggestions, the details provided, resources on the web, your hardware and software documentation, and Google, you should be able to resolve your connection problems - even where step-by-step details would have been impractical to provide for reasons of document length, or where the steps vary wildly for differing equipment.
Regardless, do note that this is not for the faint of heart: there is a lot of ground to cover and many subtleties regarding PC game connectivity and issues involved in troubleshooting problems.
You can view the current incarnation of this document at:
Please leave any comments and suggestions here, or via e-mail as shown in the document.