Sunday, December 30, 2007

Xbox 360, PS3 Games Unplayable on Future Hardware?

Earlier I posted about concurrency being a crucial issue in today's and tomorrow's games.

Recently, my son has been playing the original Xbox game Thrillville. The game is solid on our original Xbox, but crashes with disturbing frequency on the Xbox 360, particularly during loading. This leads me to suspect that the timing differences between the original Xbox and its emulation on the 360 expose threading issues that were always present but simply never manifested given the timing characteristics of the original console. Console game developers generally test exclusively against the single hardware configuration of each target platform. The only way we find threading issues, it seems, is through testing. So if testing fails to reveal real issues, games ship with them. This is mitigated, of course, by developing games for multiple multithreaded platforms (i.e., Xbox 360 and PlayStation 3) from the same code base.
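To make the failure mode concrete, here is a minimal, hypothetical sketch (not from Thrillville or any real title) of the kind of latent race that one set of timing characteristics can hide and another can expose -

// Hypothetical sketch of a latent race. On the original console, the loader
// "always" finished before the game thread touched the data, so the bug never
// fired there. Change the timing (say, under emulation) and it crashes.
#include <thread>
#include <chrono>
#include <cstdio>

struct LevelData { int tileCount; };

LevelData* g_level = 0;   // shared and unsynchronized - the actual bug

void loaderThread() {
    // Simulate disc I/O that happened to be slow enough on the old hardware.
    std::this_thread::sleep_for(std::chrono::milliseconds(5));
    g_level = new LevelData{ 1024 };   // no lock, no memory barrier
}

int main() {
    std::thread loader(loaderThread);
    // The game thread assumes loading is done "by now" - a timing assumption,
    // not a guarantee. With different scheduling, g_level may still be null.
    std::this_thread::sleep_for(std::chrono::milliseconds(1));
    if (g_level)
        std::printf("tiles: %d\n", g_level->tileCount);   // works by luck
    else
        std::printf("crash (or would have): level not loaded yet\n");
    loader.join();
    delete g_level;
    return 0;
}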

Emulating Concurrent Code

This is one kink in a larger problem. The problem is developing a body of 'literature,' whether games or other software, that will be usable on future machines. Take, for example, the classic game Ultima VII. It is a beautiful and vastly influential role-playing game rendered unplayable on modern machines because of its hardware voodoo. (We may also occasionally need to use software such as old versions of WordPerfect, VisiCalc or Lotus 1-2-3 - and perhaps from non-PC machines - to read crucial, but old, data stored in those formats.) For such situations we typically resort to emulators such as the excellent DOSBox or AppleWin for maximum compatibility.

It is very difficult, if not impossible, to maintain timing-level compatibility in emulators. Older games that relied heavily upon machine-specific timing are now pretty much broken in emulators. For that matter, they're generally broken when the next generation of hardware arrives. Multithreaded code that contains hidden bugs (which is pretty much all code, multithreaded or otherwise) is implicitly and heavily timing-dependent. It's one thing to emulate an Xbox game, which typically has very low levels of multithreading, on an Xbox 360, and it may be quite another to emulate Xbox 360 games on an Xbox 720.

As a result, I suspect that far more Xbox 360 and PlayStation 3 games will be unplayable on future hardware than Xbox and PS2 games on the 360 and PS3.

Thursday, December 20, 2007

Why Do We Use C++?

One of the recurring questions (especially when confronted with concurrency) is 'why do we use C++ for game development?' This entry was extracted from Concurrency in Game Development since it's really a separate topic all its own and that one was overlong anyway.

Why Do We Use C++?

Now before I hear the platitudes about how [insert favorite non-C++ language here] is better anyhow and would fix everything in one shot, let me ask you how feasible it would be to develop multi-million line software on multiple, often brand-new, platforms simultaneously with extreme performance expectations, leveraging mega/gigabytes of shared legacy code and without having to port, develop and/or maintain one's own compilers and toolsets with this superior language. If the game development industry turns to non-C++ languages to solve this, it will be slowly and painfully. And, most likely, it will be an industry-wide move and not just one studio or another, although some must lead the way. Honestly, I would never expect C++ to go away completely (just as assembly has never really gone away), and some future approaches may just be layered on top of C++ (Bigloo, Intel Ct, OpenMP), or use it when performance is critical (Java's JNI).

C++ is flexible and fast. With sufficient (possibly enormous) effort, it can do almost everything any language can do. It can function both as a high-level multiparadigm language and as a low-level portable assembly language. What makes C++ (and C) so widespread is its unrestrictive nature. This is widely seen as a negative, but in the real world, being able to abuse your language to get what you want from your machine is of crucial importance - especially when performance is a primary concern. That said, you may only want to abuse your language, say, 2-5% of the time. The rest (95-98%) of the time, you'd like some nice, type-safe, bounds-checked, memory-managed, interactive, memoizing language, giving you a >100% increase in productivity. That we have settled on C++ indicates that the 2-5% is so crucial that we're willing to sacrifice the rest for it. In the game industry, I think that's a fair statement.

Another, often forgotten strength of C++ - and of many traditional modular and modular-turned-OO languages - is linking or, more generally, 'package management'. C++ offers build-time linking natively, and runtime 'linking' (i.e., DLLs) is usually available as well. This allows graceful scaling. Tool support for this in C++ is strong and reasonably robust because C++ is so frequently used to build enormous projects. It's possible to build massive projects in pieces in ways that are awkward or impossible in many functional or logic languages. (See the comments for some strong points about other languages' package handling vs. C++'s, and Which Languages Handle Packages and Libraries Best?)

Still, writing solid C++ code, even in the absence of multithreading, requires a mastery of nearly the whole language, making it dangerous for inexperienced developers. Even merely adequate C++ coding relies heavily on learned idioms that are not part of the language itself and are therefore unenforceable. For example, the facts that you are allowed to return a pointer or reference to a stack-allocated object from a function, and that you are allowed to overflow a string buffer, have cost the world untold man-hours and dollars. And yet for all its ills, C++ ranks near the top of the most useful (or at least most used) languages ever explicitly designed (including Esperanto).
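Both of those examples compile without complaint; a minimal illustration (not drawn from any real codebase) -

// Two classics that C++ happily accepts: a dangling reference and a buffer
// overflow. Some compilers warn about the first; neither is an error.
#include <cstring>

const int& smallest(int a, int b) {
    int m = (a < b) ? a : b;
    return m;                    // returns a reference to a dead stack variable
}

void greet(const char* name) {
    char buffer[8];
    std::strcpy(buffer, name);   // no bounds check: "Bartholomew" overflows it
}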

(4/11/08 - adapted from one of my comments in reddit.)

I should stress again that one of the more crucial issues surrounding language choice is the set of tools provided to us by the console vendors. C++ wasn't adopted until late in the console world because C++ standards weren't well supported by console vendors. Tools have always been poor on consoles compared to the PC (this has changed somewhat with the 360), and good-quality C++ compilers were rare in previous console generations, though C compilers were available (although they, too, came late - with the original PlayStation or Saturn, I believe?).

The PC gaming world hasn't seen this sort of lag. Game developers are eager to adopt new technologies; on the consoles, those technologies simply haven't been available.

This is an exploratory post (as they all are..) and is subject to change.

Tuesday, December 18, 2007

Concurrency in Game Development

Effectively developing concurrent software for modern machines with multiple cores is perhaps the greatest technical challenge we'll encounter in some time. Our current approaches just aren't up to the task of creating robust multithreaded code.

Why Do We Use C++?

Please see the linked post for some thoughts on this topic. Before we look directly at concurrency, let's look at what brought us to where we are.

Categories

It’s reasonably well established that categorization is a fundamental human strength. You might say that feature extraction and classification are hardware-accelerated in the brain. Even at levels far below the conscious (e.g., in the visual cortex), information is categorized before it even becomes 'thought' to us. In fact, categorization is so fundamental to human thought that assuming categories themselves are real objects has been a universal illusion. In “How the Mind Works”, Pinker states that nearly all cultures initially adopt a ‘folk idealism’ as a result of this. Applying this to the boundary between man and machine communication, Object Orientation has evolved as a straightforward way to map categories (type, class, etc.) and instances of those categories onto machine architectures that deal primarily in a few primitive and largely undifferentiated numerical types.

Abstractions are complexes of categories and their interactions. Understanding complexity in terms of hierarchies of abstraction is something that people do really well. Modular Programming and, to a greater extent, Object Orientation attempt to give us tools to work at these levels of human competence - with, of course, some consequences in terms of final performance.

The predominant programming paradigms attempt to map the way people think onto the way machines operate. Let's turn our attention to concurrency and see if we can stretch this a bit further.

Concurrency as Time vs. Space

In software development, one of the things I’ve noticed is that people are much better at understanding space than time. This is why we map out time in timelines, MS Project files and a million other ways in spatial form. What this means to programming is that anywhere you can map out state in terms of space, the result is far easier to comprehend. For a clear example of this, see Google's MapReduce.

This is the core issue with concurrency. It’s very difficult to understand the possible states of concurrent systems because they happen in time. Many of the abstractions that can help with concurrency (such as Functional Programming, Message Passing, etc.) are useful because they essentially transform a state-heavy process into some equivalent but more understandable spatial map whose design appears much more static.
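As a toy illustration of that transformation (a generic sketch, nothing specific to any framework), compare a state-heavy accumulation written as a loop with the same computation written as independent per-chunk work plus a combining step -

// Same computation, two shapes. The first interleaves state updates in time;
// the second lays the work out "in space" (a map over chunks, then a reduce),
// which is easier to reason about and trivially divisible among cores.
#include <vector>
#include <numeric>

// Time-shaped: one mutable accumulator threaded through every iteration.
long sumOfSquaresSequential(const std::vector<int>& v) {
    long total = 0;
    for (size_t i = 0; i < v.size(); ++i)
        total += static_cast<long>(v[i]) * v[i];   // state mutated at each step
    return total;
}

// Space-shaped: independent per-chunk results (the "map"), combined at the
// end (the "reduce"). Each chunk could run on its own core with no sharing.
// Assumes chunks > 0.
long sumOfSquaresChunked(const std::vector<int>& v, size_t chunks) {
    std::vector<long> partial(chunks, 0);
    const size_t n = v.size();
    for (size_t c = 0; c < chunks; ++c) {              // each chunk is independent
        size_t begin = c * n / chunks, end = (c + 1) * n / chunks;
        for (size_t i = begin; i < end; ++i)
            partial[c] += static_cast<long>(v[i]) * v[i];
    }
    return std::accumulate(partial.begin(), partial.end(), 0L);
}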

Open Questions

In some ways the game industry is at the vanguard of multicore development on consumer machines with the Xbox 360 and the PlayStation 3. The amount of time and effort that we spend finding and fixing multithreading bugs is terrifying. While the hardware companies move toward doubling the number of cores with each processor generation, software companies will be reeling.

Clearly, C++ as it currently stands is not well suited to developing software for highly multicore machines. Ideally, we’d have a language, extension or paradigm that would allow us to map thread concurrency onto easily understood (that is, spatial) language constructs that discourage or prohibit the kind of multithreading errors that currently waste hours and hours of our time. Right now, we don’t have to worry about how many registers there are on the processor when we write C++ code. Similarly, whatever language/paradigm we’d want to use would, at the compile stage, optimally generate code for the number of cores available to it on the target platform.

So would some sort of functional language be best? C++ with functional extensions? Erlang? Haskell?

What about programming models based more on hardware description languages such as VHDL and Verilog, which are inherently spatial? Would they map more effectively to multicore machinery since hardware languages describe processes which are inherently asynchronous?

In any event, we software developers have an interesting road ahead.

Wednesday, December 12, 2007

Times of Lore (maps)

A few years ago I wrote a program to spit out the maps from the PC version of the 1988 game Times of Lore, which I loved as a youngster. The places where there were cities mostly show up here as bright green grassy areas. Thanks to Andrew Schultz and his site for inspiring me to dig up and post these images. Oh, and check out Kenn Rice's site too.

The world -


The dungeons -

Friday, November 09, 2007

The Impossible Dream #4

(Updated 11/23/2007)

Where Have I Been?

It's been months since I've blogged. They've been productive months, though. See, in early October, another beautiful baby girl joined our family! It's been a rush having another child. It's incredible how you can still be completely awestruck as if it were the first time, each and every time. :)

Prior to that, vacation.

Because I had some time off, I slipped in some hours on my Impossible Dream. I hadn't really considered blogging about it yet until I stumbled across Project Steel & Glass at The Cluttered Desk, which motivated me to write an update.

Before leaving on vacation this summer, I'd made some headway with Pair, my embedded script language. In particular, I'd made some strides compiling it to more efficient bytecode, etc. It's in a sort-of usable state. Still, some things, like lexical closures, aren't 100% there yet. I'll leave blogging about that until later.

Basically, much of the material pass (no illumination yet) of the game is rendering. It looks pretty raw and repetitive with respect to content right now. There's not yet much differentiation in city sections. No niceties such as shadows or even basic lighting yet. And, jeez, how am I gonna do trees? It's time, though, to switch gears for a bit and prototype the gameplay. As you might infer from the screenshots, the protagonist will be capable of flight. I've got my Xbox 360 controllers hooked up to my PC and I can fly all around town.

(Zaragosa, Mexico [fictional])


(Puebla, Mexico [real])


Please, I'm Just One Man!

So I must enlist the machine to generate a great deal of content automatically, albeit offline. This approach, of course, runs the risk of a high degree of repetition, but I think with some caution and tool refinement this will turn out great. I'm really happy with how auto-generation has gone so far. It's actually starting to vaguely resemble the residential city streets of central Mexico, which is, of course, the idea.



Anyhow, I'm pretty pumped about it. Given some time (lots) and little-by-little refinement, we may have something pretty cool here.

Tuesday, July 31, 2007

NCAA 08's Video Highlights

I'm pretty proud of this one. ;)

Thousands of user-created videos from NCAA Football 08!

http://www.easportsworld.com/#canvas_TopVideos

I posted this on 4/12/07:

I've been working on a little something special in NCAA Football 08. An NCAA Football 08 First Look article by GameSpot details the surprise. So now that it's public..

".. Once you've found a highlight you're particularly fond of, you can then choose to create a highlight of that play. This lets you choose from multiple camera angles to take a snapshot or segment of game-generated video. From there, you can either save these photos or videos on your hard drive or upload them to share with friends. .."

About the middle of last year, I was asked to do some R&D for this technology. Not long after that we were building it. Still, it came really close to being cut on a number of occasions. There were times when it seemed impossible to satisfy all of the below constraints and still produce a video of acceptable (or any) quality. I had taken some time off for a bit of surgery as NCAA was finaling and came back to find some bugs in my .flv encoder. So, I was a little paranoid that some random, unforeseen issue might render the whole feature unusable in the shipped game.

Constraints

There were some wicked constraints in putting this one together, including -
  • Encoding while rendering the game at 60fps/X360 and 30fps/PS3 without dropping frames.
  • Keeping final file size small enough to not overwhelm the servers and to keep our bandwidth costs down.
  • Final visual quality given the other constraints.
  • Memory during encoding. Early in the cycle, I gave a rough estimate of 25MB free memory in-game to encode to .flv. It turns out we had to do it in under 500K in-game, and we did that by offloading some tasks until later and some other tricks I won't - and shouldn't - go into ;).
Compromises
  • No sound. This one's hard to take but we just didn't have time. Remarkably, on the web, I haven't seen anyone complain about this. For next year.
  • 15 frames per second. Yes, we wanted thirty, but we could only do that with a smaller frame size. Believe me, the trade off was well worth it. Also for next year.
  • Video dimensions. We would have liked a larger frame size but we just didn't have the machine resources or network bandwidth to spare.
All in all, I think we struck a reasonable balance and it works. It's exciting to see people this jazzed about it! This technology is now quite the buzz around EA and next year we'll be seeing it in a lot more titles.

Will this technology and its descendants usher in the Web 2.0 era in console sports gaming?

I hope so!

Update Jan 3, 2008

Check out #2 on this list at pastapadre.com.

.. Launching with the release of NCAA Football 08 [EA Sports World] was an immediate success as fans of the game utilized the screenshot and video uploading features and it became one of the most positively responded to new additions to any sports game in recent memory. ...

For video highlights and optimizations geared at getting our football games to 60fps on PS3 starting with NFL Tour, I was nominated for the 2007 Outstanding Achievement in Engineering award for EA Sports - Tiburon. :)

Sunday, July 29, 2007

Trampolines

When developing embedded languages for applications in the past, I found that the most time-consuming part was always wrapping the multitude of foreign-language (i.e., to/from C++) function calls of interest with script-compatible functions.

One of the goals of Pair is to make working with other languages - especially C/C++ - very easy. So here we'll take a different approach. This time we'll generate trampolines to call foreign functions (for example, from a DLL) and do the boxing/unboxing of values for Pair to use. Trampolining has a number of different meanings, but many of them center around the idea of generating code at runtime to call a function with special requirements that aren't known until runtime. In this case, we're directly generating the machine code necessary to set up the stack frame and call a function using either the __cdecl (for the C runtime library) or __stdcall (for the Win32 SDK functions) calling convention.

This allows us to do things like the following -
(let ((beep   (import "kernel32.dll" stdcall uint Beep (uint uint)))
      (system (import "msvcrt.dll" cdecl uint system (string))))
  (system "dir c:\\root")
  (beep 500 1000))
C++ code -

nativecaller.h
nativecaller.cpp
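The real trampolines are generated machine code, but the flavor of the import-and-call half can be sketched with nothing more than LoadLibrary, GetProcAddress and a typed function pointer (a simplified illustration, not the nativecaller code itself) -

// Simplified illustration of calling an imported function the way the Pair
// 'import' form does, minus the runtime code generation: load the DLL, look
// up the symbol, cast to the right calling convention, call, box the result.
#include <windows.h>
#include <cstdio>

typedef BOOL (__stdcall *BeepFn)(DWORD frequency, DWORD durationMs);

int main() {
    HMODULE kernel32 = LoadLibraryA("kernel32.dll");
    if (!kernel32) return 1;
    BeepFn beep = reinterpret_cast<BeepFn>(GetProcAddress(kernel32, "Beep"));
    if (beep) {
        BOOL result = beep(500, 1000);   // in Pair, this value would be boxed
        std::printf("Beep returned %d\n", result);
    }
    FreeLibrary(kernel32);
    return 0;
}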

Tuesday, July 10, 2007

Nearest Neighbor

What's the fastest way to find the closest pair of three-dimensional points in a large set? That was the question posed for the most recent Hacker's Delight challenge.

We just released the results for Hacker's Delight #7: Nearest Neighbor. A little prematurely, it turns out - we had misplaced a bunch of the entries. Once the remaining entries were found, I ran the tests before I came home tonight, and Jim should be putting them up soon. It turns out that the missing entries don't really affect the outcome.

This is the third Hacker's Delight challenge I've blogged about. Challenge #6: 64 Choose 4 came from the post Let Me Count The Ways and, in full circle, Let Me Count The Ways (Part II) comes from the challenge. From Hacker's Delight #4: Primes came Sieve of Eratosthenes.

Alistair Milne from EA Montreal blew Nearest Neighbor out of the water with a blazingly fast and beautiful algorithm for finding the nearest two points in a cloud of points in three-dimensional space. For comparison, the very slow reference implementation is here. I wondered if I could get any more performance out of Alistair's algorithm and came up with an uglier but ~25% faster SIMD-optimized version of it. Sadly, my own reasonably fast implementation for Nearest Neighbor is incorrect about 7% of the time.
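For anyone who hasn't seen the problem, the naive approach - in the spirit of the slow reference implementation, though not the actual contest code - is an O(n^2) scan over all pairs -

// Naive O(n^2) closest-pair search over 3D points - the baseline the contest
// entries were trying to beat. Returns the indices of the closest pair.
// Assumes at least two points.
#include <vector>
#include <limits>
#include <utility>

struct Point3 { float x, y, z; };

std::pair<size_t, size_t> closestPairBruteForce(const std::vector<Point3>& pts) {
    std::pair<size_t, size_t> best(0, 1);
    float bestDistSq = std::numeric_limits<float>::max();
    for (size_t i = 0; i + 1 < pts.size(); ++i) {
        for (size_t j = i + 1; j < pts.size(); ++j) {
            float dx = pts[i].x - pts[j].x;
            float dy = pts[i].y - pts[j].y;
            float dz = pts[i].z - pts[j].z;
            float d2 = dx * dx + dy * dy + dz * dz;   // compare squared distances
            if (d2 < bestDistSq) { bestDistSq = d2; best = std::make_pair(i, j); }
        }
    }
    return best;
}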

My implementation, and many others we received, were based on the canonical nearest-neighbor algorithm as described in Cormen's Introduction to Algorithms. It seems that in every respect, Alistair's algorithm is superior. It's more elegant, it's much faster and it does less work.

Be sure to check it out.

Saturday, June 30, 2007

A Developer's Life Stages

I ended up writing a novel as a comment on Jamie Fristrom's blog post on Speculative Generality. Read his entry. I'm reposting my comment here.

Developing The Simplest Thing That Could Possibly Work

In Forrester Research's 10 Mistakes in Software Development, #3 is overscoping a solution and #9 is jumping into development without enough research, and we've all done it. :(

A friend consulted at a large transportation company in the Kansas City area. They'd been gearing up for a massive conversion from their existing AS/400 systems to a completely Java-based system. Remarkably, in the early stages of the project, they decided to wrap the entire JDK in their own custom framework and decreed that all developers use these wrappers exclusively. As a result of this, they obtained -
  • little or no added functionality,
  • more limited abstractions than the raw JDK,
  • minimally tested code underlying everything,
  • the requirement to train all incoming Java developers on their framework and,
  • worst of all, the obligation to maintain many more thousands of lines of code.
I've observed that developers tend to progress through a number of stages in their career. These are like the stages of grief: inescapable. This is because each stage produces a set of skills and philosophies that are eventually synthesized in the seasoned software developer.

The Underengineer

The underengineer is elated at his ability to get things done quickly and be productive, disdaining others (usually the overengineer) for what he sees as excessive rumination and for producing an excessive amount of code for a given task. You can count on these guys to get stuff done. Will it be done 'the right way?' Only if by accident. Underengineers are often unable to discern 'the right way,' often because they've spent little time reading and maintaining other people's code.

A developer may begin to come out of this stage when he tires of fixing or writing the same code (in different places) over and over again. Or perhaps he'll be propelled into the second stage by an enthusiastic reading of Design Patterns or a sudden grasp of UML.

The developer at this stage is fascinated by accomplishing the task at hand and it is this stage in which he learns how to code and debug.

The Overengineer

The overengineer knows that he can solve any single development task and has become fascinated by possibility. Abstraction is magical. How can any piece of code be used to solve multiple tasks? Paradoxically, this usually ends up producing more code for any one task. While the underengineer shakes his head at this, the overengineer knows that his approach will produce much less code for an entire set of similar problems. Meanwhile, the overengineer looks at the underengineer's hackery with contempt. The question the overengineer rarely asks himself, though, is will there be a large enough set of similar problems to justify this effort? And so he abstracts everything in the name of flexibility.

The overengineer is sometimes shocked to realize he's become much less productive in terms of functionality than before. He can't seem to get anything done! Still, he produces as much (if not more) code and suddenly becomes capable of building much larger systems than he could as an underengineer.

This is the stage in which the developer learns architecture and the value (and painfully, the costs) of abstraction. This is the stage of the architects of the transportation system conversion discussed above.

What kills the overengineer? Maintenance. Dependencies. Broken abstractions. Realizing that the flexibility he built in goes unused and that real systems change in ways he could never have anticipated. At some point, and probably many, the overengineer will get a phone call at 2am asking him to come in and fix an issue that breaks his whole architecture. After a few of these heartbreaking moments, his world view begins to change again.

The Seasoned Developer

The seasoned developer is skilled in coding and debugging from his time as an underengineer and is a skilled architect from his time as an overengineer. What's more, he's come to understand the following:

The primary reason for abstraction is to simplify implementation for a given set of requirements.

If abstraction doesn't simplify, ditch it.

The seasoned developer also realizes that the solution domain for a given problem may extend beyond technology. People often try to use technology to solve social or organizational problems. Conversely, people frequently attempt social or organizational solutions for essentially technological problems. Recognizing these incongruities before they become a software design and implementation effort can help avoid doomed projects.

The organizational problem in the transportation example above is that software architects were assigned to do a job before anyone knew what job they were to do. The proper response would be to do nothing - or better, to work on a different project until the problem domain was fully understood.

Instead they chose speculatively general busy work, incurring a much higher cost.

Friday, June 29, 2007

The Future of Processor Hardware

Many-Core Processors, Languages for Multiple Processors and Reconfigurable Computing


Many-Core Processors

Multicore processors are giving way to many-core processors, with Intel estimating that the number of cores will double with each processor generation. The paradigms of the most commonly used programming languages are ineffective with respect to these types of processors. So what will happen?

Languages for Multiple Processors

Modified C++/Java/C# with futures and promises (Flow Java)?

Functional languages are inherently more parallelizable than imperative languages. Will we see a resurgence of these types of languages?
- Concurrent Haskell/Concurrent Clean
- F#
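For a sense of what the future/promise style looks like in code, here's a small sketch using std::async and std::future from present-day C++ (anachronistic for this post, purely illustrative) -

// Futures in a nutshell: kick off independent work, keep going, and only
// block at the point where the result is actually needed.
#include <future>
#include <vector>
#include <numeric>
#include <cstdio>

long expensiveSum(const std::vector<int>& v) {
    return std::accumulate(v.begin(), v.end(), 0L);
}

int main() {
    std::vector<int> a(1000000, 1), b(1000000, 2);
    // Each async call may run on its own core; the futures stand in for
    // results that don't exist yet.
    std::future<long> fa = std::async(std::launch::async, expensiveSum, std::cref(a));
    std::future<long> fb = std::async(std::launch::async, expensiveSum, std::cref(b));
    // ... do other work here ...
    std::printf("total = %ld\n", fa.get() + fb.get());   // block only when needed
    return 0;
}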

High-Performance Reconfigurable Computing

Hybrid systems combining conventional (Von Neumann) architectures with FPGAs and other forms of reconfigurable hardware are starting to mature, but with radically different models of "software" development.

Is High-Performance Reconfigurable Computing the Next Supercomputing Paradigm?

From Wikipedia

Paradigm Shift

It is called the Reconfigurable Computing Paradox, that by software to configware migration (software to FPGA migration) speed-up factors by up to almost 4 orders of magnitude have been reported, as well as the reduction of electricity consumption by more than 1 order of magnitude - although the technological parameters of FPGAs are behind the Gordon Moore curve by about 4 orders of magnitude, and the clock frequency is substantially lower than that of microprocessors. The reasons of this paradox are due to a paradigm shift, and are also partly explained by the Von Neumann syndrome.

Tuesday, June 26, 2007

EASTL

EA genius and colleague Paul Pedriana submitted a paper to the C++ Standards Committee detailing his EASTL, an EA version of the Standard Template Library that provides a number of efficiencies designed with game development in mind but nonetheless applicable across other software domains.

I've always been a fan of the STL in general, and since I've been at Electronic Arts, of EASTL. It's nice to see some of these innovations get out into the world.

Thursday, June 21, 2007

Spread This Number?

AACS encryption key controversy

This bad boy?

13,256,278,887,989,457,651,018,865,901,401,704,640

So where was I when all the noise about this was going on in April? Ah, illegal numbers. What will they think of next? Anything that can be represented digitally can be represented as a single (usually very large) number. Software, movies, music, etc..

That reminds me of a story I read once about a civilization that encoded all its existing knowledge as a single mark on a stick. The position of the mark on the stick divided by the length of the stick yielded a decimal number less than one. The digits of the fractional part of that number represented the totality of the civilization's knowledge in an encoded form.

Could this work?

Let's think about this. A proton has a diameter of about 1.5E−15 m. That means we'd get only about 16 decimal digits before we're measuring sub-subatomic dimensions. And Shannon tells us it would take lots of digits to represent that knowledge in any reasonable form. Since a decimal digit carries only about 3.3 bits, it would take roughly 2.5 million decimal digits per megabyte of data. And we're talking at least trillions of megabytes of data.

So no, there is no way - now or ever - that this could work.

Thursday, June 14, 2007

Horrific Default Behavior/Failure Modes

(Updated 7/10/07)

Horrific Default Behavior

Today I wrote a batch file called clean.bat and put it in a directory on a different machine. It contained -

del /f /s /q *.obj
del /f /s /q *.ilk
del /f /s /q *.pdb

I had the directory up with Windows Explorer and '\\machine_name\c$\dev\..' was in the address bar. I saved the file into the directory of the other machine. Pretty straightforward, right?

So, thoughtlessly, I double click on clean.bat. Without ado, the batch file starts executing with this message -

'\\machine_name\c$\dev\..'
CMD.EXE was started with the above path as the current directory. UNC paths are not supported. Defaulting to Windows directory.

C:\WINDOWS>del /f /s /q *.obj


Holy crap! Fortunately, I saw it immediately and broke out of it. Even more fortunate, I don't think C:\WINDOWS has any .obj files.

But, if I had wanted to clean .exe files instead (which I frequently do), my machine would have been utterly hosed.

That, my friends, is horrific default behavior.

Failure Modes

Last year, our water heater died. It was pretty old, I think. The problem was the way that it died - by catching fire. That was pretty disturbing.

So the next day, while I was at work, the repairman came out to replace it. Rebekka asked him why it caught fire. The repairman explained, "that's how you know it's broken." Really? The next time my water heater fails, I'll know by following the fire trucks home to see my subdivision in cinders. "Time for a new water heater," I'll say to myself.

And how will I know when my nail clippers need replacing? Oh that's right, from the mushroom cloud over Orlando.

Wednesday, June 06, 2007

Compiling Haskell

I've played with the Glasgow Haskell Compiler only a bit. Compiling Haskell to, say, native code is quite a chore. GHC compiles through a number of special intermediate languages between source and machine code.

Haskell > Core (Hindley-Milner typed lambda calculus) > STG > C/C-- > native machine code

That's a lot of work.

Small Tools

(Updated 7/10/07)

Part of the purpose of this blog is to remind myself of best practices I've discovered through the years I've been developing software. One of these best practices was brought to mind again in a recent conversation with a colleague.

Small, Combinable Tools

Unix, of course, is built around this philosophy. In essence, it is an extension of the practice of modular programming to applications. It allows us to combine the functions of applications in an automatic or semiautomatic way. These are generally command-line tools, which may make them inconvenient in certain circumstances. Nevertheless, with a judicious layering approach, it is frequently possible to capture the best of both worlds - command-line combinable tools underneath, user-friendly GUI on top.

UnxUtils

If you use Windows and you develop software, it is absolutely imperative that you download and use UnxUtils. They are native Win32 ports of many standard unix utilities, and they can be extraordinarily useful. Add them to your PATH. Oh, and if you're not comfortable in unix and, by extension, with these utilities, let me encourage you strongly to learn to use them well. You'll thank me later!

Windows SysInternals

Windows SysInternals. You may not need these beauties often, but when you need them, you need them badly. I believe that few (or none) of these are command-line operable (I may be mistaken).

Windows Utilities

Junction for symbolic directory links in Windows. Get it.
PathMan for path management, and a ton of other good stuff here.
Unlocker, a fabulous (GUI) utility for unlocking files, by Cedric Collumb.

Build Utilities

Ant and NAnt for builds.

Freeing pov2mesh

My pov2mesh utility is currently distributed under the shareware model. I've pretty much decided to release it as freeware. If I do this, I will refund the registered users and remove the registration code from the application. Refunding will be easy, because there are so few registered users! I will not release the full source now, although I may in the future. I've also developed a number of other small tools that I will slowly be polishing (a little, not much) and releasing here sometime.

ASPack

($29) No, it's not a fanny pack. This executable packer is phenomenal.

Not So Small Tools

UltraEdit

($50) My primary text editor. It loads fast and I love its Find/Replace In Files. TextPad's a close second.

LTProf

($50) An easy-to-use, very lightweight profiler.

Araxis Merge

($129) I don't know how I'd live without it.

MilkShape

($35) Basic, inexpensive low-polygon 3d modeler that can import/export almost anything.

Ultimate Unwrap 3D

($50) Unwrap is great for texture mapping.

Paint Shop Pro

($90) The choice, of course, of every game and web developer who prefers it to Adobe's heavyweight (and expensive!) Photoshop. Now that Corel owns it, hopefully it will fare as well and not become as bloated (or as expensive!) as Photoshop.

Gnu Prolog

Need a good Prolog implementation for expert system development or curiosity or anything else?

In the rare moments when I use Prolog, I use GNU Prolog.

Thursday, May 24, 2007

The Impossible Dream #3

This is, of course, going incredibly slowly and that's just alright.

Script

I'm exploring an interesting architecture. I'm starting to use my script language Pair to script core parts of the game. Pair is Lisp/Scheme-like and very compact (the VM executable [compressed] is 28k without many external functions!). I designed Pair a few years ago with gaming in mind. The original intent was to use it to script many (hundreds or thousands of) simultaneous events without having to make everything a state machine. Pair is multithreaded at the virtual machine level and not at the OS level, yielding very lightweight threads. To accomplish this, an explicit frame stack is maintained for function calls, and they are not implemented as C++ function calls underneath. Like Stackless Python, which takes a similar approach (and is also used in gaming for its facility with concurrency), this means that Pair can (and does) support continuations, because the call stack can be directly manipulated. One of these days I'll put up Pair and its source.
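To illustrate what an explicit frame stack buys (a generic sketch, not Pair's actual implementation): because calls push plain data frames instead of recursing on the C++ stack, the interpreter can suspend, swap or copy a script thread's entire call stack at will - exactly what lightweight threads and continuations need -

// Generic illustration of a VM with an explicit frame stack. Script calls
// push/pop Frame objects on a vector instead of recursing in C++, so the
// whole call stack is ordinary data that can be saved, swapped (green
// threads) or copied (continuations).
#include <vector>
#include <cstdio>

struct Frame {
    int functionId;      // which script function is executing
    int returnAddress;   // bytecode offset to resume in the caller
    int local;           // stand-in for this frame's locals/environment
};

struct ScriptThread {
    std::vector<Frame> frames;   // the explicit call stack - plain data
    int pc = 0;                  // current bytecode offset
};

// A call is just a push; a return is just a pop.
void callFunction(ScriptThread& t, int functionId) {
    t.frames.push_back(Frame{ functionId, t.pc, 0 });
    t.pc = 0;   // jump to the callee's first instruction
}

void returnFromFunction(ScriptThread& t) {
    t.pc = t.frames.back().returnAddress;
    t.frames.pop_back();
}

int main() {
    ScriptThread t;
    callFunction(t, 1);
    callFunction(t, 2);                 // nested call, no C++ recursion involved
    ScriptThread saved = t;             // capturing a continuation is just a copy
    returnFromFunction(t);
    std::printf("live depth %zu, saved depth %zu\n",
                t.frames.size(), saved.frames.size());
    return 0;
}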

On the Side

In the '80s, in front of a house in the Kansas City, Missouri area, a man built a boat. It was an incredibly tall, half-built steel boat in this guy's yard. Every once in a while we'd drive by and see people working on it and a bit of progress here and there.

One day, my dad decided to stop and talk with the guy, with us little kids in tow. My brother and I took the opportunity to see the beast up close. And it was freaking amazing. To us, it might as well have been a Saturn V.

Many (and many of the best) software developers I've known have had little (or large!) projects on the side. Interviewing developers, I've sometimes asked about their side projects. It's a question that can yield great insight. Where does their passion lie? What problems attract them? What someone does with free time is what they really want to do, regardless of what they might let on in an interview.

It's an indicator of passion and of a drive to create. I've recently come to understand that these side projects also fuel the passion for our day-to-day work and inoculate against burnout. As developers, we have the power to create something from nothing. For some small portion of our time, to follow our own muse can be an exhilarating act.

Some companies, notably Google, have institutionalized this idea, giving developers a certain percent of time during office hours to work on their own projects. I'd expect Google to reap a significant innovative advantage with its policy.

What compels me about software development is only partially the puzzle-solving; for me, it's about the joy and power of creation. Unfortunately, this power is, in part, illusion, and the illusion is intensified by software's abstract nature. It nearly always seems far easier to write software than it actually turns out to be.

I'll demonstrate this illusion. Which requires more effort in man-hours: building a bridge across a medium-sized river or developing a game such as Spider-Man 3 or Madden 07? How about the Empire State Building or Windows XP? I don't know the answers to these questions, but I don't believe that I can fully trust my intuition to provide them. In any event, good software requires enormous, and universally underestimated, effort.

Maybe one of these days, I'll have another kooky neighbor, this time toiling away in his garage or barn building a Space Shuttle or something like in that goofy movie "The Astronaut Farmer" (which I have not yet seen). I'll know and he'll know and the fire department and the old ladies from the HOA will all know that it will never lift off. And I'll sit in the garage with that guy, and put the LOX and kerosene tanks together. And it will be freaking amazing.

(Updated 7/11/07)

(Rebekka and I watched The Astronaut Farmer last night and now I wish I hadn't referred to it. I'd hoped for something good, like Apollo 13 or October Sky, but Astronaut Farmer was godawful. The premise was utterly absurd and the profoundly self-centered protagonist Charlie Farmer reminded me of an Allie Fox (from the great movie The Mosquito Coast) barely disguised under a veneer of sentimentalism. The best part? The rocket launches from their barn. As in a barn made of wood and filled with hay. And speaking of wood, during one of the bloopers, Billy Bob Thornton asks, in jest, if this was a film directed by Ed Wood. I was wondering the same thing.)

Saturday, May 05, 2007

Fun, Fun, Fun

..until her daddy takes her GTA away?
One of the things that has become even more abundantly clear in the transition to next-generation consoles is how divorced the concept of fun is from what we consider a quality game. Quality games must be fun, of course, to a point. They don't have to be too fun, though, and originality in this arena carries an enormous and intolerable financial risk. High-end games must be beautiful to look at, slick, and contain vast quantities of content. Frequently, they require online experiences. These types of games must sell millions to recoup development costs. Because the risk is so enormous, there is a lot of copycatting of successful models.

Let me provide an example. I loved GTA III, liked GTA Vice City, found GTA San Andreas boring and True Crime, Saint's Row and even Crackdown dull, dull, dull. Fun usually requires novelty (or nostalgia), and there's no more novelty left in stealing cars and ramming them into other cars and pedestrians until they explode (the cars and the pedestrians).

Of course, this is something that everyone involved with games knows. But sometimes it's useful to state what's already widely understood.

(Updated 7/10/07)

Grand Theft Auto IV

Ok, I know what I said about GTA-type games getting boring. Still, GTA IV looks awesome. And of course I'll have to play it.

If I get bored by it I guess I'll only have myself to blame..

Wednesday, April 18, 2007

HDTV's Golden Ratio?

Is the 16:9 aspect ratio of movies/HDTV (a ratio of 1.777..) more attractive than the 4:3 of Standard Def (1.333..) because it is closer to the golden ratio (1.6180339887..)?

Rebekka says she believes that it's intentional. At first I didn't think so, but now I agree. The golden ratio is a crucial concept in architecture and the visual and musical arts, so it would make sense.

Thursday, April 12, 2007

Senzee's Developer Interview Test, Q. 1

You have two hours to complete this exam.

1.) Without looking at the following questions, quickly estimate how long it will take you to finish this test correctly. Please do not let your estimate exceed the time limit given for this test.

..

..argh..

Friday, April 06, 2007

Moving Parts

This is where I intend to catalog (all?) the components of a commercial game. I will periodically update this post.

Why?

I've bought and read many books on game development. The bad ones are generally 1000 pages of C++ source code. The good, useful ones tend to fall into two categories.


  • Narrow and deep (books that equate 'game engines' and 'rendering engines', for example)
  • Broad and shallow (books that contain light sections on rendering, sound, networking, data structures, etc.)
Still, even books in the broad category aren't that broad. The reason? Books can only be so thick. My point really is that the practice of game development includes a vast array of subspecialties and I'd kinda like to enumerate them here - hopefully with input from my readers (that's you, Mom), which I'll add as it comes. So this list will be ultra-broad with no depth whatsoever. Depth will have to wait for subsequent posts.

Now, not every game will have or need all these systems, but it's crazy just how many most games will need. So, for right now, this is a brain dump and I've only just begun..

Offline

Content Creation - tools - geometry, mesh creation
Content Creation - tools - geometry, mesh creation - offline CSG tools
Content Creation - tools - image creation
Content Creation - tools - motion capture cleaning
Content Creation - tools - higher level 'world' assembly, including assigning gameplay attributes to objects
Content Creation - tools - sound creation/mixing/production
Content Creation - tools - mission/level creation
Content Creation - tools - statistical data, skill/inventory data and attributes
Content Creation - tools - state machine creation
Content Creation - tools - script debuggers, interactive interpreters for testing
Data Management - managing generated code
Data Management - asset dependency management
Data Management - geometry processing - ie. adding attributes, triangle stripification
Data Management - texture processing - texture packing, format conversions, bit depth conversion, etc.
Data Management - asset databases (SQL, XML)
Data Management - metadata collection about assets, data mining for global performance enhancement, data mining for content creation automation, etc.
Data Management - version control systems
Code Management - Code Generation - from asset databases, other offline sources
Code Management - Build System - standard way of building the game for all developers
Code Management - Library/Component Management - system(s) for cataloging, versioning and incorporating libraries and reusable components
Code Management - version control systems
Testing Systems and Methodologies


(much more offline/pipeline side to come; the game pipeline gets far less attention than it deserves, often with disastrous results)

Runtime

System - threading libraries/primitives
System - error handling, logging, assert handling
System - string handling, regular expressions
System - elementary data structures, STL, etc.
System - fast math primitives
I/O - File - standard file i/o
I/O - File - file compression/decompression
I/O - File - loading game assets (meshes, textures, scripts, game data files, etc..)

I/O - File - low-level game asset caching
I/O - File - asset 'packaging' into larger files
I/O - File - object serialization support
I/O - Network - lobby
I/O - Network - server side
I/O - Network - event handling (see Events below)
I/O - Network - publishing player data to online services
I/O - Network - web access services
Localization Support - text databases
Memory - efficient game asset caching
Memory - memory pooling systems
Memory - garbage collection systems
Events - event handling/dispatching systems
Events - 'replayable' game state management
Events - game state represented as loadable/saveable data stream
Events - game state policies - random/deterministic, etc..

Events - game 'flow' management
Sound - Mp3, WAV, Ogg, etc. playback
Sound - runtime mixing, filters

Sound - in game capture
Sound - stereo, 5.1 surround sound
Sound - 3d audio
Input - console controller
Input - force feedback (output)
Input - spatial (Wii, SIXAXIS)
Input - mouse
Input - keyboard
Input - other (Guitar Hero, EyeToy, DDR, Donkey Konga)
Graphics - rendering - geometry processing
Graphics - rendering - geometry processing - progressive meshes and LODs
Graphics - rendering - geometry processing - efficient spatial subdivision
Graphics - rendering - shader(s), per pixel lighting

Graphics - rendering - scene graph management
Graphics - rendering - GPU based utilities for non-graphics
Graphics - rendering - geometry processing - in game CSG operations
Graphics - geometry processing - human animation

Graphics - geometry processing - motion capture
Graphics - rendering - decaling
Graphics - rendering - particle systems
Graphics - rendering - dynamic skyboxes/skydomes
Graphics - images - loading/encoding image formats - DDS, TGA, JPG, etc.
Physics - collision detection
Physics - collision detection - geometry processing
Physics - collision handling, mesh processing for non-rigid object effects
Physics - ragdoll animation
Physics - wind, environmental effects
Gameplay/AI - script
Gameplay/AI - state machine systems
Gameplay/AI - classical AI algorithms
Gameplay/AI - events - collisions, sphere of influence effects, NPC line of sight, NPC earshot

Gameplay - Cars - Damage Models
Gameplay - Cars - Driving Models
Geometry - Modeling

Geometry - Inverse Kinematics
Geometry - Rigging/Tagging/Markup
Geometry - Animation
Geometry - Procedural Animation (ie. Cloth, Hair, Jelly/Spring Models, etc..)
Geometry - Facial Performance and Lip Sync'ing
Rendering - Textures
Rendering - Shaders
Rendering - Animated textures/maps
Video - codec(s)/playback - MPEG, MJPEG, WMV, VP6, Bink, Madcow, etc..
Video - color space conversion
UI - rendering
UI - event handling systems, architecture and data
UI - script
UI - some games have a large amount of UI (ie. Madden, with its enormous UI)

UI - HUDs
Data Management - embedded SQL database (not just for plain-old IT anymore)
Embedded Control - script languages
Control - in-game gameplay tweaking utilities, runtime assets updating support


(more..)

Online Game Services

Online - lobby
Online - server side
Online - (deterministic) player position prediction and tracking (online multiplayer)
Online - publishing player data to online services
Online - web access services
Online - billing tracking for online services (online multiplayer)

Online - player data storage
Online - accessibility from PC website

(more to come, but I'm sleepy now and I want to go to bed..)

Wednesday, April 04, 2007

Optimal Bytecode Interpretation

I always feel like I should blog more about engineering related stuff, since that's what I do all day.

For now, my earlier plans for bytecode to C++ compilation are on hold as we're looking into more immediate gains from aggressively optimizing the interpretation. One of the downsides of the hybrid C++ compilation/bytecode interpretation approach that I have in mind is that it will require a workflow change. The C++ will eventually need to be compiled and then linked to the executable. There's a pipeline/workflow bubble there that's not appealing. Still, I think it would be fantastic to get that built for critical core libraries of script (utilities) that don't change much.

Optimizing Interpretation

Our bytecode is that of a dynamic and (as a result of the structure of the instruction set) terribly slow language. Unfortunately, we use an off-the-shelf compiler, so we can't infer much that was present in the original source but doesn't show up in the bytecode output.

We did some memoization - in this case, caching of variable and function lookups - with excellent (awesome!) results.
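The idea, roughly (a sketch, not our actual code): pay for the expensive name resolution once, cache the result keyed by the name, and invalidate when the environment changes -

// Sketch of memoizing variable/function lookups in an interpreter: the slow
// scope walk runs once per name; later hits come straight from a hash map.
#include <unordered_map>
#include <string>

struct Value { double number; };

// Stand-in for the real, expensive lookup (walking scopes, tables, etc.).
Value slowLookup(const std::string& name) {
    return Value{ static_cast<double>(name.size()) };
}

class LookupCache {
public:
    const Value& get(const std::string& name) {
        auto it = cache_.find(name);
        if (it != cache_.end()) return it->second;                   // fast path
        return cache_.emplace(name, slowLookup(name)).first->second; // miss: fill
    }
    void invalidate() { cache_.clear(); }   // e.g., when a scope is redefined
private:
    std::unordered_map<std::string, Value> cache_;
};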

A clever universal hashing scheme for both special strings and user strings opened up a lot of possibilities.
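For illustration, one simple and widely used string hash is FNV-1a (shown below as a sketch; not necessarily the scheme we actually used). Hashing engine-defined 'special' strings and user strings through the same function lets both live in one lookup table -

// FNV-1a: a simple, fast 32-bit string hash often used to key both
// engine-defined ("special") strings and user strings in a single table.
#include <cstdint>
#include <cstddef>

uint32_t fnv1a(const char* s, size_t len) {
    uint32_t hash = 2166136261u;              // FNV offset basis
    for (size_t i = 0; i < len; ++i) {
        hash ^= static_cast<unsigned char>(s[i]);
        hash *= 16777619u;                    // FNV prime
    }
    return hash;
}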

Close attention paid to string handling yielded good results.

Thread synchronization primitives were killing us. We usually try to keep our assembly language to a bare minimum for portability's sake. This translates to a handful of primitives to improve PS2 performance. But to rid ourselves of our most time-consuming locks, we applied asm to a couple of critical spots where we need to read in, modify, then write out whole cache lines as atomic operations on the PowerPC machines (PS3, 360). This avoids the expense of a more heavyweight lock such as a mutex in those places and gives us back a lot of performance for multithreaded operation.
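In portable terms, the pattern is a compare-and-swap retry loop in place of a mutex. The console versions use PowerPC load-reserved/store-conditional instructions; the sketch below uses std::atomic purely to convey the shape -

// Sketch of the lock-free read-modify-write pattern (here with std::atomic
// compare_exchange; the console code used load-reserved/store-conditional
// instructions instead of a mutex).
#include <atomic>

std::atomic<unsigned> g_refCount(0);

void addRef() {
    unsigned expected = g_refCount.load(std::memory_order_relaxed);
    // Retry until our modified value is written without interference -
    // no mutex, no kernel call, just a few instructions on the common path.
    while (!g_refCount.compare_exchange_weak(expected, expected + 1,
                                             std::memory_order_acq_rel,
                                             std::memory_order_relaxed)) {
        // 'expected' was reloaded with the current value; loop and try again
    }
}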

Anyhow, we've significantly improved the bytecode execution performance, which is something the game teams have been asking (begging!) for for ages.

Of course, it's never fast enough.. ;)

Friday, March 30, 2007

Language Vector

"What cannot be spoken of cannot be defeated."


My head isn't clear and I pass in and out of sleep. I'm infected.

Stay with me as I type this. I'll give you a little background. In the 1950's the last of the South Fore engaged in ritual necrophagia and the prion disease Kuru recently claimed the last victims among them.  The last that ate the thoughts of another.  Nearby, another Papua New Guinea tribe discovered a more pernicious and far more horrifying threat about forty years ago.  For years, their population shook and trembled for a reason unknown to them.  They pointed to a cave in which words approximated by the English phrase The Black Whisper of Death were inscribed. "The plague came from in there," they said.

8-43 0x12a_rr . ;09]]

Globalization efforts brought the whisper heard round the world.  Oh, look! On the overhead TV, that image..


A beloved American religious leader at a political celebration passes an attractive, conservatively dressed young woman. As the light of recognition flares in his eyes, he warmly greets her by name. She smiles mechanically. Suddenly horrified he whirls around to locate the camera capturing an unmistakable expression of humiliation.

It's on TV every other day.  She was a minor porn star.  It was not that he greeted her that condemned him.  He could have legitimately known her in a number of ways.  Even had he known her biblically it would have been taken in stride.  Who cares?  It turns out his televised shame indicted him more decisively than a living-room stash of indecent DVDs.  After all, it is shame and little else that condemns us.  The tabloids carried that image for months next to the articles decrying Death's Whisper.  It doesn't matter.  The young woman and the religioso are dead now, victims of their own fascination with their tabloid fame and more directly, victims of its adjacent coverage.

In this case it is knowledge that destroys. dsf6 -saf ;; This is not a disease borne by a bacterium, a virus, or a prion. See, this whisper is a disease carried by the most insidious of vectors - language itself. To simply understand what it is is to be irreversibly afflicted. What cannot be spoken of cannot be defeated. I am condemned already and by writing I damn my readers. But there will be no readers..

Some said it was the Creator, dropping in a kill switch to flip at the turn of a phrase.  It's adaptive. They named the disease in the native tongue and it killed them. Scientists described it in English and French and Spanish. It killed them. The blind read of it in Braille, the deaf signed in ASL. Experts from around the world spewed a rainbow of euphemisms. It killed millions. It is not bound to any particular lexical form - perhaps its representation in the brain creates the disease ex nihilo. I don't really know. We did not discover the mechanism that drives this epidemic.

It may be strange to think of language spreading disease, but it is not unique.  What is it that viral machines inject into living cells if not RNA/DNA coded assembly language pathogens?  I know that machines aren't affected because words are transparent to them. Still, machines played the pivotal role in the spread -&89sd(( of the disease as its immune and efficient carriers.

Can you imagine (no, you can't - because there is no you) seeing all those around you die for an unknown cause of death?  As death certificate handlers really understood the meaning of unknown cause of death they came down with the disease in droves.  Suddenly, the certificates began listing things like Onychocryptosis and Internecivus raptus.  Those who witness the deaths but don't know the cause are inexorably driven to discover it.  (Most of) those who know, and are infected, try desperately to not reveal it to anyone and have always failed.  Now though, strangely, I'm compelled to type this even if I have to type with the keyboard upside down to keep the blood from my hands from running into it.  I can no longer hope for any resolution but the obvious.

And I wait for that resolution even as the asymptomatic machines auto-hyperlink this document. No future, fatal question will go unanswered action diminishes c0ntracts freedom entered invalid through fear judged excuses penalty attached contrary vectpr for reflex arc st1mulus unexpexedly 0kers afwe ~pwnd xasl gatattaccaagggtc link <> link 032 . . .

(story copyright (c) 2007 Paul D. Senzee)

Wednesday, March 21, 2007

Bytecode to Native Compilation

At work, some of the software I develop uses a bytecode interpreter. And, as always, we need better performance from the whole system. So I'm looking into bytecode-to-native (in this case, C++) compilation. I've done this before with embedded Lisp-based languages, and there are a number of compilers available that do this for Java (GCC has a back end for this), C#, Lisp and its derivatives (Bigloo for Scheme, for instance). Compiling to C or C++ is great as it serves as a sort of portable assembly language, and it's possible to further leverage the fine optimization skillz of modern C/C++ compilers. I'll report on this when I make some progress - if I don't get pulled off onto something else.
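To make the idea concrete, here's a toy sketch with a hypothetical two-opcode instruction set (not our actual bytecode): the interpreter pays dispatch and stack-traffic costs per instruction, while the 'compiled' version is just the straight-line C++ a bytecode-to-C++ translator might emit -

// Toy illustration of bytecode-to-C++ compilation. Interpreting the program
// {PUSH 2, PUSH 3, ADD} means a dispatch loop; "compiling" it means emitting
// equivalent straight-line C++ and letting the C++ compiler optimize it.
#include <cstdio>

// --- What the interpreter does at runtime --------------------------------
enum Op { PUSH, ADD, END };
struct Instr { Op op; int arg; };

int interpret(const Instr* code) {
    int stack[64]; int sp = 0;
    for (const Instr* ip = code; ip->op != END; ++ip) {
        switch (ip->op) {                 // per-instruction dispatch overhead
        case PUSH: stack[sp++] = ip->arg; break;
        case ADD:  { int b = stack[--sp]; stack[--sp] += b; ++sp; } break;
        default:   break;
        }
    }
    return stack[sp - 1];
}

// --- What a bytecode-to-C++ compiler could emit for the same program -----
int compiled_add_2_3() {
    return 2 + 3;   // dispatch, stack traffic and bounds all optimized away
}

int main() {
    const Instr program[] = { {PUSH, 2}, {PUSH, 3}, {ADD, 0}, {END, 0} };
    std::printf("interpreted: %d, compiled: %d\n",
                interpret(program), compiled_add_2_3());
    return 0;
}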

Wednesday, February 28, 2007

Jet Engines

Creating high-performance technology in a competitive field? Managing your developers' weaknesses and ignoring their strengths may cost you.

Let me illustrate.

Iron accounts for 35% of the Earth's mass. And from iron comes carbon steel (iron and carbon), which accounts for 90% of steel production. As metals go, steel is cheap, abundant and easy to work with. Steel is versatile and good at most everything metals do. If steel were your employee, it would be a damn fine one. You might want all your employees to be just like steel. After all, compared to other materials, it has few weaknesses - but also few extraordinary strengths.

Steel can't do everything. It distorts at extreme temperatures. Steel is not brittle like ceramics, but it's also not nearly as hard. It is much cheaper than titanium but weighs almost twice as much. It is a poorer electrical conductor than gold. And of course, steel's strength and hardness in extreme conditions doesn't hold a candle to the single-crystal nickel-based superalloys* from which jet turbine blades are carved - albeit at enormous cost.

Every one of these other materials demonstrates incredibly high performance in a narrow space and is either prohibitively expensive or ineffective in other arenas.

Even so, it is impossible to build a modern, efficient jet engine with just typical carbon steel.

The bottom line?

Profound strength and profound weakness often come together. Yet without a spectrum of those profound strengths at your disposal, it won't be possible to develop cutting-edge technology at a competitive level.

I don't mean to say that a well-rounded developer can't also have profound strengths - I know several of these people. It's a no-brainer that you need to do what it takes to keep these gems. Reward reliable and well-rounded developers for their lack of weaknesses and, if you need their contributions, reward brittle, unorganized, high-maintenance, socially-inept (insert favorite weakness here) developers for their strengths. Or someone else will.

Notes

* Some single-crystal superalloys are nickel-iron alloys. If iron is the dominant metal in the alloy, these superalloys might be considered steel.

For More Information

Now, Discover Your Strengths

Rolls Royce Trent Aeroengine

Excellent PowerPoint of the Rolls Royce Trent Aeroengine series commercial jet engines
Rolls Royce Trent Aeroengine engine materials

Wikipedia: Iron Steel Ceramic Cermet Silicon Carbide Superalloys Complex Metal Alloy Composites

Development of Single Crystal Superalloys

Tuesday, February 06, 2007

Senzee's Rule of Interface Complexity

If what an interface is supposed to do, how it's supposed to do it and the role it plays in a system are not apparent after five minutes of staring at it, it's too complex.

The Impossible Dream #2

Previous: The Impossible Dream #1

Ground Rules

Mantra 1: The mechanic first, then the game.

It's no fun putting blood, sweat and tears into an unfun game. We structure the game around a solid mechanic - prototype first to find the addictive key element(s), then elaborate. Tag is one key element we've identified.

Mantra 2: Build a game, not an engine.

I've seen a number of home and indie game projects go south, stuck at the "let's build our engine" phase. Nothing overly general - we build into the code only what we need to do the trick. I'm not saying there's no place for useful abstraction here; I'm saying that a general-purpose rendering or game engine is not our goal.

Mantra 3: Fit the design to resources at hand.

Building an open world game at all is our first violation of this mantra. Still, we need to embrace this reality instead of fighting it. Operating with restricted resources can force us to come up with creative solutions to problems we otherwise wouldn't have looked at. Of course, it may also prevent us from ever finishing anything.

Mantra 4: Don't underengineer or overengineer, but if you must choose one, underengineer.

Why? Because underengineering is easier to fix.


An Unflinching View

Let's start the impossible dream by looking, somewhat realistically, at our available resources and the strengths and weaknesses of our position.

Resources
  • A few development hours a month divided among game design, software development and content creation.
  • A powerful target platform (PC).
  • Development tools - Visual C++ 2003.
  • Direct3D and its auxiliary libraries (see No need for portability in Strengths, below).
  • Third party content creation tools - Milkshape 3D (registered), Unwrap3D (registered), POV-Ray.
  • Open source and free resources on the web for game development.
  • Free content from other sources that can be legally used in our game.
Strengths
  • Strong software development skills.
  • A powerful target platform (PC).
  • No need for portability.
  • A great deal of freely available software for offline and runtime use.
Weaknesses
  • Poor content creation ability.
  • Few hours to dedicate to the project - it'll be done in like one hundred years.
  • Little monetary investment.
Strategies

Make the most of our particular set of resources:
  • Simplicity - a simple, compelling core game. No minigames, nothing like that.
  • Prototype the game mechanic first.
  • Automation - machine generated content, mostly offline, possibly at runtime.
  • Reuse/scavenge (to a point) freely available content generation tools and software.
  • Reuse/scavenge (to a point) freely available software for the runtime.
  • Seek out free content that we can legally use in game.
  • Creatively adapt the game design, including character and plot, to cater to our strengths.
  • Use a hooded figure for the main character, allowing us to skirt the character animation issue. We can use simple segmented models for NPCs, and we can animate the main character's hood and clothing with cloth simulation, which is CPU-intensive but nearly content-free (see the sketch after this list).
  • Use public domain (classical, folk music?) music for the soundtrack.
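
Since cloth is doing the heavy lifting for the main character's look, here is a bare-bones sketch of the Verlet-integration particle update that cheap cloth animation is usually built on. It's an illustration of the technique, not our actual code; a real system would follow each step with constraint relaxation between neighboring particles and some collision handling.

// Bare-bones Verlet integration step for cloth particles (illustrative only).
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };

struct Particle
{
    Vec3 pos;      // current position
    Vec3 prevPos;  // position from the previous frame
};

void integrateCloth(std::vector<Particle>& cloth, const Vec3& gravity, float dt)
{
    const float damping = 0.99f;  // crude air resistance
    for (std::size_t i = 0; i < cloth.size(); ++i)
    {
        Particle& p = cloth[i];
        Vec3 cur = p.pos;
        // x' = x + (x - xPrev) * damping + a * dt * dt
        p.pos.x += (cur.x - p.prevPos.x) * damping + gravity.x * dt * dt;
        p.pos.y += (cur.y - p.prevPos.y) * damping + gravity.y * dt * dt;
        p.pos.z += (cur.z - p.prevPos.z) * damping + gravity.z * dt * dt;
        p.prevPos = cur;
    }
}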

Oh, and yes, there's so much more to come.

Monday, February 05, 2007

Red 5's Pitch

(First off, special thanks go to Ryan Burkett who allowed me to use his digital camera to photograph these items.)

Last week I received a Fed-Ex package at my desk. All excited to get a package (kind of like those people who win on the radio and say "I never win anything!!"), I grabbed Ryan Burkett to open it with me. Inside there was a paper box, roughly 3" x 6" x 1", that looked like some sort of puzzle. The box was orange and black with an artist's rendering of the birth of a planet. It carried a large white digit 1 in the lower left-hand corner. Small print above the 1 read:

R5L_001 TRACE
not the end
a beginning
a new beginning



One corner was cut out and a slightly smaller box was visible inside. I opened the first and pulled out the second. It was similar, with olive green instead of orange and a large 2 in the corner. It sported an illustration of a shard of molten rock (or perhaps the broken wing of a spacecraft?) and read:

R5L_002 TRACE
not a new world
but an old world
made new

So, I figured this was some kind of Matryoshka-doll package. I pulled out the third box, sea-green, marked with a large white 3 and a primordial underwater scene.

R5L_003 TRACE
no change is peaceful
and through life struggles
it also thrives

Out came number four, peach-orange with a river of lava flowing through the gaping maw of a mountain pass.

R5L_004 TRACE
to forge a new path
through the darkness
to rise to the call of glory



And then finally number five, red with the image of a blue elf from Red 5's website and the text:

R5L_005 TRACE
not the end
a new beginning
with new eyes to greet it



The fifth box opened like a book or a box of software, and inside was an iPod Shuffle with my name engraved on it!





Now, pausing for a moment, it occurred to me that everything about this is loaded with references that I don't think I fully understand. The images and text seem vaguely Genesis-like, with the days of creation and all, but that's six days, not five. So I'm not sure if that's the reference, but it seems an apt association with the creation of a new studio, a new game and the new changes in the lives of the people coming to join Red 5.

So I power up the iPod and listen to track 1.

Paul, this is Mark Kern, President of Red 5 Studios and former team lead for World of Warcraft. I came across your blog on the net and was impressed with the depth of your inquisitiveness regarding game design, programming and mathematics. Your work at Tiburon is impressive and we'd love the chance to meet you. At Red 5 we're assembling a team of incredibly talented individuals dedicated to pushing the envelope in online entertainment. We're building a new type of game company and a new type of game. And we believe you're someone who just might fit into the Red 5 family. Log into red5studios.com and enter the code found engraved on this iPod. We'll tell you all about it.

So I pull up the site:

Who are we?

Red 5 is a creator-owned, newly formed studio driven to create new and original massively multiplayer online entertainment. We are world-builders, storytellers and game makers. We own what we make, and we have the financial backing to stand on our own.

So Here’s Why We’re Calling.

Red 5's CEO came across your blog and was impressed with the depth of your inquisitiveness regarding game design, programming, and mathematics. Your work at Tiburon is impressive, and we'd love the chance to meet you.

Because of this, you are one of one hundred people in the industry we've invited to check out Red 5 Studios. We've played your games, and we respect your skills.

Paul, we are hiring, so feel free to take a look at our open positions. We had a few in mind that might be a good fit, but we really believe in people and, specifically, your potential. If you don't see something that fits, we can talk about tailoring something to your specific talents.


Pretty impressive pitch, huh? Certainly very flattering!

Here's the original post.

Friday, February 02, 2007

The Impossible Dream #1

True Story

In Mexico I met a poor farmer named Miguelito who lived in the little town of Metepec (Atlixco), Puebla. Many years before, in a vision, the Holy Virgin of Guadalupe entrusted him to build a cathedral, high up on the hill above the town. She promised him that he would not die until he'd set the final stone. He quickly enlisted the help of the townspeople and over 15 years constructed a small, beautiful hilltop church. At the time Miguelito showed me the rust-colored church it was nearly complete and he explained that this meant that his time was near. His mistress had been generous enough - Miguel was 104 years old!

My Impossible Dream - My Own Open World Game

So, is it possible, working a few scant hours each week on evenings and weekends, to build a (simple) open-world game by myself? We shall see. I've been working on the game off and on since about a year before I came to EA, but I think I'll make a more public spectacle of it on this blog - to keep me motivated.

So why do I even want to do this? Because it's impossible.

Some prototype screenshots
Some generated content and concepts

Next Time: in the next installment of The Impossible Dream, we'll look at what this game will be about and how we will use automatic content generation to build our world.

Thursday, February 01, 2007

Red 5 Studios

This week I received a package from Mark Kern, former team lead of World of Warcraft and current CEO of Red 5 Studios, who is, it turns out, a fan of my blog! Before I delve into more, let me say that Mark and company have the coolest recruitment strategy I've ever seen. And although I'm not looking to leave EA Tiburon at this point in my career, perhaps there's a future out there...

Mark and I spoke for some time about his company. Red 5 aims for the center of the emerging MMO phenomenon in ways that transcend east-west cultural boundaries to create an experience as compelling for Chinese and Korean gaming communities as it is for American, European and Japanese communities. What's more, Red 5 is staffed with a large number of ex-Blizzard game makers who know how to build the kinds of MMOGs that can do this. As an American company at ease with Asian cultures, Red 5 has given an enormous amount of thought to handling the cultural nuances to make this work. I'm not an expert when it comes to MMOs, but I'm convinced. If there's a developer to back in the MMO space, it's these guys.

Red 5 Taps Some Green
Red 5 Studios Aims To Be Pixar of Online Gaming

Wednesday, January 31, 2007

Rules of Play

I didn't get to work on an experimental gameplay type project as I'd hoped this holiday break. However, I did get a ton of stuff done that I'd been putting off for quite some time - and that's fantastic!

I borrowed a copy of Rules of Play today, which I'm finally going to read after much recommendation. I also grabbed a book on architecture for my little city-generator side project. Maybe I should check this out as well. Speaking of which, perhaps I'll meet Will Wright one of these days, since we work for the same company and all.

I got Gears of War and Marvel: Ultimate Alliance for the 360 for Christmas, both of which are great!

Wednesday, January 24, 2007

7

(Updated 7/10/07)
Inspired by the interest in my 5-card poker hand code that plugs into Cactus Kev's evaluator, I've decided to revisit my unholy 7-card evaluator and make a faster (I hope), cleaner one that I can then post here.

For the 5-card hash I used Bob Jenkins' Perfect Hashing code. Check out his excellent site for great perfect hashing code and ideas.

My current 7 card evaluator first determines if there is a flush. If not, it looks up the final value in a 13 * 13 * 13 * 13 * 13 * 13 * 13 (13^7, 63M entries) precalc'd table. Arghh! If it is a flush, though, it evaluates all 21 combinations (7 choose 5) in the normal (albeit optimized) way.
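
In outline, that evaluator looks something like the sketch below. The card encoding (rank * 4 + suit) and the helper names are placeholders for illustration rather than my actual code; the point is the control flow - flushes take the slow 21-combination path, while everything else is a single read from the rank-indexed 13^7 table.

// Outline of the current 7-card evaluator described above (sketch only).
#include <cstdint>

extern const uint16_t kRankTable[];               // precalc'd, 13^7 (~63M) entries
uint16_t evalBestFiveOfSeven(const int cards[7]); // slow path: all 21 combos

inline int rankOf(int card) { return card >> 2; } // 0..12
inline int suitOf(int card) { return card & 3; }  // 0..3

bool hasFlush(const int cards[7])
{
    int counts[4] = { 0, 0, 0, 0 };
    for (int i = 0; i < 7; ++i)
        if (++counts[suitOf(cards[i])] >= 5)
            return true;
    return false;
}

uint16_t evalSeven(const int cards[7])
{
    if (hasFlush(cards))
        return evalBestFiveOfSeven(cards);        // rare path

    uint32_t index = 0;                           // common path: suits don't matter
    for (int i = 0; i < 7; ++i)
        index = index * 13 + rankOf(cards[i]);    // index into the 13^7 table
    return kRankTable[index];
}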

But this is not how I want my grandchildren to remember my code. Let's think about other options. Now, { 52 choose 7 } yields about 133 million possibilities, right? The first crucial step in optimizing the seven-card hand evaluation is figuring out a way to efficiently map every unique set of 7 out of 52 cards to one unique number among those 133 million possibilities.

As it turns out, I've got some code to do that. Nevertheless, I need to do a little cleanup before I post that. So look for "7 Part II" sometime soon.. ;)

Part II: 52 Choose 7

As promised, code to map any 7 of 52 items (7 of 52 bits) to a unique index in the range of 0-133M (52 choose 7).

index52c7.h

This, of course, could be used for a super-fast 7-card hand evaluator with a precomputed table of about 266MB.

Jing, commenting below, mentions that a 2+2 forum has some super-fast seven-card hand evaluators. Glancing briefly at the site, I notice claims of 12.5 cycles per evaluation, which seems too good to be true. After all, a single out-of-cache table lookup can cost much more than that. But if it is true - sweet!

Some Clarifications

Andy Reagan emailed me and made some excellent points concerning the readability of index52c7.h.

"[It's hard] to understand what the code was doing without comments and with the generalized table and variable names.."

I apologize for that. Actually, I wrote another program to generate this file, which is one of the reasons why it's so obtuse. It would probably be a good idea to publish the generator program as well, except that I lost it in a hard drive crash.  There were few casualties, but sadly, that was one of them.

"What does the function index52c7 do?"

Here's the reasoning for index52c7:

We can completely represent a hand of 7 cards of 52 with a single 52-bit number with 7 bits set. We assign each possible card in a deck a number between 1 and 52, inclusive. For example, the Queen/Hearts might be 43 and the 2/Spades might be 17. Then, we take a 64-bit number (large enough to contain the 52 bits) and set a bit for each of the 7 cards we have. If two of the seven cards we have are the Queen/Hearts and the 2/Spades, we'd set bits 43 and 17, along with the bits that correspond to the other five cards.

Now, if we had unlimited memory, we could just use this number as an index into an enormous and very sparse array. Unfortunately, this array would have 2^52 (4.5 quadrillion) entries. Assuming two bytes per entry, that would require 9 petabytes of memory! So we need to somehow hash this number into a much smaller space. It turns out that the number of possible combinations of 7 items among 52 is about 133 million (52 choose 7), so ideally, we could somehow hash the 52 bit number into a number between 0 and 133 million that uniquely identifies a given hand.

That's what index52c7 does. It translates the 52 bit hand representation into a much smaller, but still unique number. At two bytes per entry, that gives us a table of 266 megabytes, which is large and in certain cases inconvenient, but certainly doable.
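
For the curious, the underlying idea is essentially the combinatorial number system: if the seven set bits sit at positions c1 < c2 < ... < c7, then C(c1,1) + C(c2,2) + ... + C(c7,7) is a unique number in the range 0 to (52 choose 7) - 1. Here's a hand-rolled sketch of that mapping - an illustration of the idea, not the generated, table-driven code in index52c7.h, and its index ordering needn't match.

// Map a 52-bit hand mask (exactly 7 bits set) to a unique index in [0, C(52,7)).
// Bits are numbered 0..51 here (the post numbers cards 1..52; either labeling works).
#include <cstdint>

// C(n, k) for the small values we need.
static uint64_t choose(int n, int k)
{
    if (k < 0 || k > n) return 0;
    uint64_t r = 1;
    for (int i = 1; i <= k; ++i)
        r = r * (n - k + i) / i;
    return r;
}

uint32_t handIndex(uint64_t mask)
{
    uint64_t index = 0;
    int seen = 0;                            // set bits (cards) consumed so far
    for (int bit = 0; bit < 52; ++bit)
    {
        if (mask & (1ULL << bit))
        {
            ++seen;                          // this is the 'seen'-th card
            index += choose(bit, seen);      // add C(position, ordinal)
        }
    }
    return static_cast<uint32_t>(index);     // in [0, 133784560)
}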

Using, say, Cactus Kev's code to evaluate each possible 7-card hand, we'd first generate the 266MB table and populate it by looking up the corresponding index with index52c7. Now that the table's fully populated, we can just pass index52c7 the 52-bit number and use the resulting index to pull the answer straight out of the array.
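
In code, that build-once, look-up-forever pattern might look like the sketch below, where eval7_slow stands in for a Cactus-Kev-style best-five-of-seven evaluation and nextSevenCardMask for a routine that enumerates all 52-choose-7 masks - both hypothetical names, shown only to make the flow explicit.

// Build the ~266MB table once with a slow evaluator; afterward every
// evaluation is just the index computation plus one array read.
#include <cstdint>
#include <vector>

uint32_t handIndex(uint64_t mask);       // the mapping sketched above
uint16_t eval7_slow(uint64_t mask);      // best 5 of 7, the slow way (placeholder)
bool nextSevenCardMask(uint64_t& mask);  // advance to the next 7-card mask (placeholder)

std::vector<uint16_t> buildTable()
{
    std::vector<uint16_t> table(133784560);  // 52 choose 7 entries, 2 bytes each
    uint64_t mask = (1ULL << 7) - 1;         // lowest seven bits set: first hand
    do {
        table[handIndex(mask)] = eval7_slow(mask);
    } while (nextSevenCardMask(mask));
    return table;
}

inline uint16_t eval7_fast(const std::vector<uint16_t>& table, uint64_t mask)
{
    return table[handIndex(mask)];           // one mapping, one lookup
}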