Memory leak?

Monkey Forums/Monkey Beginners/Memory leak?

TVCruelty(Posted 2015) [#1]
Hi,

I'd like some advice on the memory usage of my Mojo app.

When I run my program as a desktop PC app and watch memory usage in Task Manager, I can see it increase every time I interact with the game. It's a strategy game, so I have code that immediately exits OnRender if there's been no mouse or keyboard use, but this merely minimises the problem.

If, that is, it IS a problem.

If I leave the thing alone for a while, memory usage drops, presumably because the GC has kicked in.

My question is: do I have a memory leak or is this expected behaviour? The program is far too big to post up but, for info, it uses lots of image scaling, lots of FontMachine text, with scaling, and lots of multi-dimensional arrays and objects. I've been careful to manage my objects' life-cycles, I think. Some of the images are around 3Mb in size. Memory usage can go up by about 0.2 Mb every few seconds...

The game runs smoothly on my PC and on my iPad, but not so well as HTML5 on an oldish laptop. Maybe I'm just worrying too much?

Any advice gratefully received.

Thanks,
Ian


Jesse(Posted 2015) [#2]
Yes, you are worrying too much. The GC doesn't kick in on every update; it more or less kicks in after a certain amount of time. As long as it doesn't crash your game, it's OK.

About HTML5: my games run smoothly in Safari on my MacBook, while in Chrome they jitter a lot. I know a lot of programmers have been complaining about it, so I'd say that's normal behaviour.


ImmutableOctet(SKNG)(Posted 2015) [#3]
Well, there are garbage collector settings if you're that worried. However, this sounds like you're either worrying too much, or you're not reusing memory everywhere you can. The GLFW targets (as well as the iOS target) use Monkey's custom-built garbage collector. On desktop hardware it performs great, but the mindset of reusing every object you create is a good one for cross-platform development. This idea mainly came to independent developers from platforms like Android, which have had historically terrible garbage collectors. Basically, the point is to keep overhead down by making the memory environment as static as possible. Many commercial console games do something similar by statically allocating most of what they need in memory; that's why a lot of PC ports of console games are memory hogs.

But there is a method to this madness: you'd be saving as many cycles as you can while updating. You wouldn't have to wait for the heap, GC, or other external systems. In the case of garbage collection, keeping objects around in pools and then reusing them can reduce overhead, because you'd be minimising the amount of garbage generated for the collector. Sure, a few references (pointers) are being passed around, but that's a lot less work than having the GC handle the dead objects, as well as the new objects being created regularly. Creating the odd object during your game's main runtime (when 'OnUpdate' is being called many times per second) is inevitable, but if you can statically allocate within reason, the overall benefits are huge.

Speaking of pools, Monkey actually has a standard 'Pool' class you could use. It's a basic framework, so you'll have to work with it yourself (usually through basic wrappers). Monkey's standard 'Pool' will not perform any kind of explicit destruction or construction on an object. If you tell it a pool size (there's a default already; I recommend using a larger number), it'll allocate the number of objects you specify. This is done using the parameterless overload of 'New' for that class ('New' does not need to be present if no overloads exist already). Destruction is not done for you, so you'll have to set that up yourself. Calling the pool's 'Free' command will only add the object back to the pool, so you'll need to either wrap or inherit from the 'Pool' class if you want to add a custom call to your own destructor.

The main point I want to emphasise here is to keep your objects managed: everything created from a pool should have a way of returning to it. Of course, breaking this won't break the system, but it can be counter-intuitive. You can't control what the user (likely you or another developer on the project) does with the object, but you can provide a framework for that object to return. I personally have a system where I release objects using an "endpoint" with my own pseudo "free" system. Basically, I call an abstracted destructor (called 'Release', as opposed to the manual destructor, 'Free') which then either returns the object to the endpoint or flat-out destructs it, depending on the situation. This makes my job as the person handling the object a lot simpler: all I ever have to do is call 'Release', and I can assume the object is out of my hands. With systems like this, just keep in mind that references to the object will still be present, and will likely still work (even if the data inside the object could be invalid for all you know), so you'll need to write your code with a bit of caution.
This works similarly to manual memory management with an already deleted/freed object, but the main benefit is that even in the virtually impossible situation where you don't release an object before assuming you have, the garbage collector will already make sure there aren't any leaks.
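To make the wrapper idea concrete, here's a rough sketch (untested; the exact 'Pool' constructor and method names may differ between Monkey versions, and the 'Bullet' class with its 'Reset' method is invented for the example):

```
Import monkey.pool

Class Bullet
	Field x:Float, y:Float, alive:Bool

	' Manual "destructor"; the pool will NOT call this for you.
	Method Reset:Void()
		x = 0.0; y = 0.0; alive = False
	End
End

Class BulletPool
	' Pre-allocate a generous number of objects up front.
	Field pool:Pool<Bullet> = New Pool<Bullet>(128)

	Method Spawn:Bullet(x:Float, y:Float)
		Local b:Bullet = pool.Allocate()
		b.x = x; b.y = y; b.alive = True
		Return b
	End

	' The "endpoint": reset the object, then hand it back to the pool.
	Method Release:Void(b:Bullet)
		b.Reset()
		pool.Free(b)
	End
End
```

Calling 'Spawn' and 'Release' everywhere instead of 'New' keeps the garbage generated per frame close to zero, which is the whole point of the exercise.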

I'm not one for garbage collection personally; I think it has its place and should only be an option, but even I think this is a solid system.

Basically, the moral of the story is: Android garbage collection is terrible, but keeping your memory footprint at a manageable minimum size throughout the main gameplay session is ideal. Don't call 'New' for anything major in 'OnUpdate' and 'OnRender'; most of that "bandwidth" is likely already being used by enumerators and other small objects. Also keep in mind that some enumerators (Especially custom enumerators made for complex containers) could have large footprints. I have made an ancestral enumeration system for my own project which needs to be pooled in order to keep memory consistent. That's because it technically enumerates enumerators, but you get the idea.

If you go the pooling route, just make sure to keep things a manageable size; don't allocate too many objects statically unless you really need to. You could even run some tests to see how many objects are needed in your main update routine.

Also, about what you said earlier with regard to reducing uses of 'OnRender', your system probably works fine, but just keep in mind that the operating system will render the game whenever it wants. You may need to use something more advanced for this. But then again, I haven't seen your code.


TVCruelty(Posted 2015) [#4]
Thank you both for the reassuring words and advice, they're really appreciated. I'll have a good read through the Pool advice and see if I can incorporate the ideas.

At the moment I'm not planning an Android release, not least because of the fact I don't have any Android devices, but if the game's successful I'll probably want to give it a go.

Isn't Monkey X brilliant, though, and with such a supportive community? Can't believe its future is in doubt.


Samah(Posted 2015) [#5]
Diddy has two pooling systems the developer can use. There's an automatic global pool, and a pool-based stack that keeps track of both allocated and released objects.
To define a release/free method as mentioned, you just make your class implement IPoolable and implement the Reset() method. This saves you from having to extend Pool for every type.

If we have this test class:
Class Foo Implements IPoolable
	Field bar:Int
	
	Method Reset:Void()
		bar = 0
	End
End


We can use GlobalPool to reuse any number of (possibly unrelated) instances.
https://code.google.com/p/diddy/source/browse/src/diddy/globalpool.monkey
Local foo:Foo = GlobalPool<Foo>.Allocate()
GlobalPool<Foo>.Free(foo) ' foo.Reset() is called


Or we can use DiddyPool (which extends DiddyStack) if we want to keep a separate pool per stack.
https://code.google.com/p/diddy/source/browse/src/diddy/diddypool.monkey
Local dp:DiddyPool<Foo> = New DiddyPool<Foo>
Local foo:Foo = dp.Allocate() ' creates a new instance or reuses as necessary, returns the instance
dp.Allocate(5) ' creates 5 new instances or reuses as necessary, returns the last instance

For Local f:Foo = EachIn dp
	' loops through every allocated item (6)
Next

dp.Free(foo) ' foo.Reset() is called and foo is placed in the pool

For Local f:Foo = EachIn dp.FreeItems
	' loops through every freed pooled item (1)
Next

dp.FreeAll() ' releases all allocated objects in the pool


Disclaimer: I have not compiled this code.


Salmakis(Posted 2015) [#6]
One way I know to achieve some optimisation here: in some cases it can be better to have a class global, or a field, which you fill up with values at the start of a function/method and then use as if it were a local object. That only depends on the object (or one of its child objects) not being returned or used somewhere else; you just abuse it as a Local replacement.

I also try to avoid using classes like Vector as Local objects, and prefer to have some global / class-global Vector objects which I fill temporarily with the values needed for the current calculation. That's not always possible, and may not be so elegant from the coding side, but I hard-core tested several setups both ways and saw that I can save a lot of memory churn / GC work here, which in turn can also save CPU time.
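A minimal sketch of what I mean (the 'Vec2' class, field names and the 0.1 factor are all invented for the example; whether this actually helps depends on your target and how lax its GC is):

```
Class Vec2
	Field x:Float, y:Float
End

Class Ship
	' Scratch object reused every frame instead of a fresh Local.
	Global tempVec:Vec2 = New Vec2

	Field x:Float, y:Float

	Method Update:Void(targetX:Float, targetY:Float)
		' Fill the shared object with the values for this calculation...
		tempVec.x = targetX - x
		tempVec.y = targetY - y
		' ...use it, but never store it anywhere or return it.
		x += tempVec.x * 0.1
		y += tempVec.y * 0.1
	End
End
```

The caveat, as above: this only works if nothing outside the method ever holds on to the scratch object.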


TVCruelty(Posted 2015) [#7]
A couple of (possibly) daft questions:

Can I include Mojo AND Diddy? Or does Diddy extend Mojo? I'd be pleased if I could use Diddy's RealMillisecs for my random number seed, for example.

Can a memory leak occur simply because of image and font drawing, rather than object creation? I ask because one part of my game doesn't instantiate any new objects but memory usage still increases. It DOES cycle through various lists and render lots of images and some bitmap fonts, though. According to Fraps, it maintains a steady 50fps on my (gaming) PC but, like I said, memory usage keeps climbing.


ImmutableOctet(SKNG)(Posted 2015) [#8]
Alright, hang on. Samah's pooling system at least makes some kind of sense, but I feel the "global pool" idea is rather awful, and other than enforcing a specific model, the use of interfaces is an unneeded requirement.

Salmakis, on the other hand, is proposing an idea that is not only improper practice, but all-around pretty bad. The only time I'd recommend something like that is in controlled environments, not with global variables. Not only is this not thread-safe (which means your code isn't future-proof), but you're depending on a specific global element that otherwise has no basis in your code, other than as a vulgar way of caching. Use a pool, or at least keep the reused object at a per-instance or switchable level; for example, shared objects used for single or multiple passes of a routine, localised to the point of being reasonable. Here's my 'ioelement' module for reference. Anything less than a model which isolates the reused objects to some degree is asking for trouble, not to mention how bad the code would look. And to begin with, rougher implementations of this idea effectively compromise the modularity of your code.

As for using Diddy with a standard Mojo game, I assume you wouldn't have any issues, but your best bet is to either test it yourself or ask Samah directly. As far as the memory "leak" goes, what I said earlier partially touches on this. Monkey programs which use systems such as the standard enumerators will always be producing objects during your game's main run-time; the point is to reduce the number of objects generated within these steps. The garbage collector is always going to take up memory, so the best thing to do is to reduce how much you use it. In addition, different systems work differently with Monkey's garbage collector. For example, Windows will automatically adjust the number of pages allocated to a specific application. With variable amounts of memory used, Windows's heap will attempt to estimate the number of pages needed, and can therefore produce odd results. On top of that variability, there's also the nature of Monkey's garbage collection. This depends on the garbage collection settings I mentioned before; basically, you can set specific thresholds and modes for the garbage collector in order to tell it when it should run. Assuming your "memory delta" is low, the memory footprint you see in the task manager should be rather consistent. However, as I said, Windows makes estimates with what its heap gives each application. You can reduce the number of pages it gives you (the program's working set) by using external code.

The best course of action is probably to adjust the GC as needed (if at all), and then stop worrying about it. Unless it's climbing without ever coming back down, it's not a memory leak; it's probably just the GC being rather lax. To begin with, the only kind of memory leak you could possibly get would be from improper garbage collector behaviour. A memory leak in a 100% garbage-collected language is virtually impossible. The only thing that could even possibly be causing a full-fledged leak in Monkey is the native code (written by Mark or a third party), but that's target-specific, and likely not the cause of a major leak. Mojo itself doesn't do anything too serious as far as Monkey objects go, so I doubt it's the culprit. You also have to understand that the memory differential could simply be system-dependent. I brought up Windows's heap before, but it could be higher-level than that. For example, this could be Windows reporting localised memory usage from what is being passed to the display server by WGL or similar. There are also different levels of memory reporting in the task manager and kernel (public, private, shared, etc.), which may actually be providing irrelevant information.

Basically, if it's not using too much memory, it's probably just par for the course. Adjust the desktop GC as you need to, but Monkey's garbage collected, so you should expect some variability in memory usage.
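For reference, the C++ targets expose those GC settings through app config directives. A sketch from memory (verify the directive names and defaults against your Monkey version's documentation before relying on them):

```
' GC mode: 1 collects incrementally during updates (the usual default);
' 2 only collects between calls to OnUpdate/OnRender.
#CPP_GC_MODE=1

' Rough allocation threshold, in bytes, before a collection is triggered.
' Raising it means fewer, larger collections; lowering it smooths them out.
#CPP_GC_TRIGGER=8388608
```

Put these at the top of your main source file, before any Import statements.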

Here's a link to the 'EmptyWorkingSet' command, if you're interested; there's also related working-set commands in the Windows API. You'll have to use external C++ code to use these, however. (This is also Windows specific, if that wasn't obvious) Generally speaking, you shouldn't need to mess with the working-set unless you really need to keep things optimal, and you know what your program is going to use. For example, for a dedicated game server, reducing the working-set could be beneficial. Reducing the working-set will also be done automatically by the system when it needs pages which aren't actively used.


Samah(Posted 2015) [#9]
...but I feel the "global pool" idea is rather awful

This is a quick and dirty pooling implementation for lazy developers (like me) and/or people who don't want or need to use the full DiddyPool class.

...the use of interfaces is an unneeded requirement

I don't really see how this is a problem, but I suppose I could also put a ResetObject method in DiddyPool which gets called if the pooled object doesn't implement IPoolable.
Basically the developer would choose between "implement IPoolable and the Reset method" or "extend DiddyPool and override ResetObject".

The best course of action is probably to adjust the GC as needed (If at all), and then stop worrying about it.

This. Compilers and GC are written by smart people. :)

Edit:
Done. Example:
Strict
Import diddy.diddypool
Import diddy.containers

Class Foo 'Implements IPoolable
    Method Reset:Void()
        Print "Reset"
    End
End

Class FooPool Extends DiddyPool<Foo>
    Method ResetObject:Void(obj:Foo)
        Print "ResetObject"
    End
End

Function Main:Int()
    Local p:FooPool = New FooPool
    p.Allocate()
    p.Allocate()
    p.Allocate()
    p.FreeAll()
    Return 0
End

Uncomment the "implements IPoolable" to make it use Reset() instead of ResetObject().


ImmutableOctet(SKNG)(Posted 2015) [#10]
I was more getting at the use of interfaces where generics/templates already covered it. But looking into it further, you're doing something really weird (line 112). There's a dynamic cast, but shouldn't this either be forced as a requirement via the call, or forced via an inheriting pool class? Dynamic casts are always costly as far as I'm concerned. Any chance I get to switch out a dynamic cast for a virtual function/method call (or a template), I take it. Just my two cents; I know dynamic casts aren't that costly, but I'm a stickler for saving cycles. Also, that code could be optimised: you're casting twice. The result of the cast should be cached on the stack as a local variable.


Samah(Posted 2015) [#11]
There's a dynamic cast

But that's how Monkey does "instanceof"...

forced via an inheriting pool-class

Monkey doesn't have bounded generics. :(

Edit:
Also, that code should be optimized, you're casting twice. The result of the cast should be cached on the stack as a local variable.

Good call, done.


Samah(Posted 2015) [#12]
Moved the IPoolable dynamic cast into the default implementation of ResetObject. If the developer overrides ResetObject they probably don't want the cast to be done at all.