The Windows Kernel Paged Pool shit I'm dealing with is reminding me how much I hate fixed size memory pools.
I have fucking 3 Gigs of RAM in this machine and I'm using less than 1 GB! How can you be running out of memory!?! Oh, because you have a fucking
stupid fixed size paged pool thing. Now, okay, maybe *MAYBE* the OS kernel is one case in which fixed size pools are a good thing.
It's common wisdom in game development that dynamic allocations in games are bad and that "mature" people use fixed size pools because it's more stable
and safe from fragmentation and robust and so on. Hogwash!
It should be obvious why you would want variable memory allocation. Memory is one of our primary limiting
factors these days, and it should be allocated to whatever needs it right now. When you have fixed pools
it means that you are preventing the most important thing from getting memory in some cases.
For example, your artists want to make a super high poly detailed background portion of the game
with no NPC's. Oh, no, sorry, you can't do that, we're reserving that memory for 32 NPC's all the
time even though you have none here. In another part of the game, the artists want to have super
simple everything and then 64 NPC's. Oh no, sorry, you only get 32 even though you could run more
because we're reserving space for lots of other junk that isn't in this part of the game.
Now, I'm not saying that budgets for artists are a bad thing. Obviously artists need clear guidelines
about what will run fast enough and fit in memory. But having global fixed limits is a weak cop-out way to
do that.
Furthermore, recycling pools and maximum counts for spawned items is a perfectly reasonable thing to do. But
I don't think of that as a way of dividing up the available memory - it's a way of preventing buggy art from
screwing up the system, or just lazy artists from making mistakes. For example, having a maximum particle count
doesn't mean you should go ahead and preallocate all those particles, cuz you might want to use that memory
for something else in other cases (and of course the hard-fixed-size pool thing can't give that memory back when it's not needed).
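As a sketch of that distinction (my own illustration, not code from any shipped game): a particle system can enforce a hard cap without preallocating the whole pool, so untouched headroom stays available to the rest of the heap.

```c
#include <assert.h>
#include <stdlib.h>

/* Hard cap on live particles *without* preallocating the full pool.
   The cap protects against runaway art; the backing memory only grows
   as particles are actually used, so unused headroom stays in the
   shared heap. MAX_PARTICLES and the Particle fields are illustrative. */
#define MAX_PARTICLES 1000

typedef struct { float x, y, life; } Particle;

typedef struct {
    Particle *items;      /* grows on demand, never past the cap */
    int count, capacity;
} ParticleSystem;

/* Returns the new particle's index, or -1 if capped / out of memory. */
static int spawn_particle(ParticleSystem *ps) {
    if (ps->count >= MAX_PARTICLES) return -1;  /* refuse, don't crash */
    if (ps->count == ps->capacity) {
        int newcap = ps->capacity ? ps->capacity * 2 : 16;
        if (newcap > MAX_PARTICLES) newcap = MAX_PARTICLES;
        Particle *grown = realloc(ps->items, newcap * sizeof *grown);
        if (!grown) return -1;                  /* degrade gracefully */
        ps->items = grown;
        ps->capacity = newcap;
    }
    return ps->count++;
}
```

The cap is a guard rail against bugs, not a memory reservation.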
In general I'm not talking here about *dynamic* variation. I'm talking about *static* variation. Like
anything that can be spawned or triggered from scripts or whatever, stuff that can be created by the player -
that stuff should be premade. Anything that *could* exist at a given spot *should* exist. That way you
know that no player action can crash you. Note that this is really just a way to avoid testing all the combinatorics
of different play possibilities.
By static variation I mean, in room 1 you might have resource allocation like {16 NPC's, 100 MB of textures} ,
in room 2 you might have {8 NPC's, 150 MB of textures}.
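A hypothetical way to express that (the totals, field names, and per-NPC cost below are my own numbers, extending the room 1 / room 2 example): the per-room split is just data, and a build step can validate each split against the one real limit, the machine's total memory.

```c
#include <assert.h>

/* Hypothetical per-room budget table for "static variation": the total
   is fixed by the hardware, but how each room divides it is not.
   TOTAL_BUDGET_MB and the per-NPC cost are made-up numbers. */
#define TOTAL_BUDGET_MB 250

typedef struct {
    int npcs;        /* NPC slots reserved in this room */
    int texture_mb;  /* texture budget in this room */
} RoomBudget;

static const RoomBudget kRooms[] = {
    { 16, 100 },  /* room 1: more NPCs, fewer textures */
    {  8, 150 },  /* room 2: fewer NPCs, more textures */
};

/* Build-time style check: does this room's split fit the real limit? */
static int room_fits(const RoomBudget *r, int mb_per_npc) {
    return r->npcs * mb_per_npc + r->texture_mb <= TOTAL_BUDGET_MB;
}
```

The point is that the check runs offline per room, so shipping builds never discover a blown budget at runtime.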
Fixed size budgets are like partitioning your hard disk in half for programs and data. People used to do things like
that, but we all now realize it's dumb, it's better just to have one big disk and that way you can change how
you are using things as need arises.
Now, people sometimes worry about fragmentation. That may or may not be an issue for you. On Stranger on XBox
it basically wasn't an issue because we had 64M of physical memory and 2G of virtual address space, so you have
tons of slack. Again now with 64 bit pointers you have the same kind of safety and don't have to worry.
Sadly, 32-bit Windows right now is actually in a really bad spot where the amount of physical memory roughly
matches the address space, and we actually want to use most of that. That is fragmentation danger land.
However, doing variable size budgets doesn't necessarily increase your fragmentation at all. The only thing that
would give you fragmentation is if you are dynamically allocating and freeing things of different sizes. Now of
course you shouldn't do that !
One option is just to tear things all the way down and build them all the way back up for each level. That way
you allocate {A,B,C,D} in order, then you free it all so you get back to empty {} , then next level you allocate
{C,B,B,A,E} and there's no fragmentation worry. (if you tried to do a minimal transition between those sets
by doing like -D +B+E then you could have problems).
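A minimal sketch of that teardown-and-rebuild pattern (my own illustration): a linear bump allocator whose only free operation is resetting the whole thing, so each level's allocations pack tightly from the start and fragmentation cannot accumulate.

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>
#include <stdlib.h>

/* Linear (bump) allocator: allocations of any size pack in order, and
   the only "free" is resetting the whole arena back to empty between
   levels. {A,B,C,D} then reset then {C,B,B,A,E} can never fragment. */
typedef struct {
    uint8_t *base;
    size_t   size;
    size_t   used;
} Arena;

static int arena_init(Arena *a, size_t size) {
    a->base = malloc(size);
    a->size = size;
    a->used = 0;
    return a->base != NULL;
}

static void *arena_alloc(Arena *a, size_t bytes) {
    size_t aligned = (bytes + 15) & ~(size_t)15;   /* 16-byte align */
    if (a->used + aligned > a->size) return NULL;  /* over budget */
    void *p = a->base + a->used;
    a->used += aligned;
    return p;
}

static void arena_reset(Arena *a) { a->used = 0; }  /* "free it all" */
```

A level transition is just `arena_reset` followed by allocating the next level's set in order.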
Another option is relocatable memory. We did this for Stranger for the "Contiguous" physical memory. Even though
virtual address fragmentation wasn't an issue, physical memory (for graphics) fragmentation was. But all our
big resources were relocatable anyway because they were designed for paging. So when the contiguous memory got
fragmented we just slid down the blocks to defrag it, just like you defrag a disk. Our resources were well
designed for fast paging, so this was very fast - only a few thousand clocks to defrag the memory, and it only had
to be done at paging area transitions.
Note that "relocatable resources" is kind of a cool handy thing to have in any case. It lets you load them "flat"
into memory and then just rebase the whole thing and boom it's ready to use.
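The general technique can be sketched like this (this is an assumption about the usual approach, not Stranger's actual format): internal references are stored as byte offsets from the block's start instead of absolute pointers, so the whole block can be loaded flat or slid around in memory and resolved against wherever it currently lives.

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Offset-based "relocatable resource" sketch: nothing inside the block
   is an absolute address, only offsets from the block's own start, so
   moving the block just means updating one base pointer. The header
   fields here are illustrative. */
typedef struct {
    uint32_t name_offset;  /* offset from block start, not a pointer */
    uint32_t data_offset;
} ResourceHeader;

/* Resolve an offset against wherever the block currently lives. */
static void *resolve(void *block_base, uint32_t offset) {
    return (uint8_t *)block_base + offset;
}
```

Because nothing in the block is an absolute address, "defragging" is just a memmove plus updating the one base pointer, which is why it can be so cheap.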
Personally after being a console dev and now seeing the shit I'm seeing with Oodle, I would be terrified of releasing
a game on a PC. Even if you are very good about your memory use, your textures and VB's and so on create driver
resources and you have no idea how big those are, and it will vary from system to system. The AGP aperture and
the Video-RAM shadow eat out huge pieces of your virtual address space. The kernel has an unknown amount of
available mem, and of course who knows what other apps are running (if it's a typical consumer machine it probably
has antivirus and the whole MS bloatware installed and running all the time).
I don't see how you can use even 512 MB on a PC and get reliable execution. I guess the only robust solution is to
be super scalable and not count on anything. Assume that mallocs or any system call can fail at any time, and
downgrade your functionality to cope.
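A tiny hedged sketch of that "downgrade to cope" idea (the sizes and the mip-dropping strategy are hypothetical):

```c
#include <assert.h>
#include <stddef.h>
#include <stdlib.h>

/* Treat any allocation as fallible and degrade instead of dying.
   Here the fallback is dropping the top mip level of a texture,
   halving each dimension and thus quartering the memory needed.
   The strategy and sizes are illustrative. */
static void *alloc_texture_with_fallback(size_t full_bytes, size_t *got) {
    void *p = malloc(full_bytes);
    if (p) { *got = full_bytes; return p; }
    size_t reduced = full_bytes / 4;   /* drop the top mip */
    p = malloc(reduced);
    if (p) { *got = reduced; return p; }
    *got = 0;                          /* give up entirely */
    return NULL;
}
```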
Now, certainly doing prereserved buckets and zero allocations and all that does have its merits, mainly in convenience.
It's very easy as a developer to verify that what you're doing fits the "rules" - you just look at your allocation count
and if it's not zero that's a bug.
It's just very frustrating when someone is telling you "out of memory" when you're sitting there staring at 1 GB
of free memory. WTF, I have memory for you right here, please use it.
The other important point is that efficiency is irrelevant if you can't target it where you need it. Having a really
efficient banana picking operation doesn't do you a lick of good when everyone wants apples. A lot of game
coders miss the importance of this point. It's better to run at 90% efficiency or so, but be flexible enough
to target your power at exactly what's needed at any moment.
Like with a fixed system maybe you can handle 100 MB of textures and 50 MB of geometry very efficiently. That's
awesome if that's what the scene really needs. But usually that is not the exactly ideal use of memory for a given
spot. Maybe some spot really only needs 10 MB of geometry data. You still only provide 100 MB of textures. I'm
variable and can now provide 139 MB of textures. (I lose 1 MB due to overhead from being variable).
Many game devs see that and think "ZOMG you have 1 MB of overhead that's unacceptable". In reality, you have 39 MB
less of the actual resource your artists want in that spot.
"03-12-09 - Fixed Memory Pools"
3 Comments
I've heard of a surprising number of games that use a "human memory allocator". In that if you want memory, you email Bob with your request and which level(s) you want it for, and Bob looks at his Excel spreadsheet and emails you back an address. This sounds like complete lunacy, but people manage to ship stuff like that, so hey.
March 13, 2009 at 10:11 PM
Future game consoles will be four feet tall and contain a human coprocessor (an emotion engine if you will) for all computer vision and natural language processing tasks. It will be the best text-to-speech coprocessor ever invented!
And it's good for the economy; it will employ one of the billions of excess workers.
(* game consoles may smell of feces after a few days; game consoles are disposable and should not be kept more than a week)
March 13, 2009 at 10:42 PM
http://en.wikipedia.org/wiki/The_Turk
March 15, 2009 at 1:58 PM