  1. #11 mattekure (Virginia, USA; joined Feb 2018; 531 posts)
    With the limited testing I've done, it does seem to be limited to a single function/process space. But I've noticed, at least with FG Classic, that if I create a bunch of nodes, 100 or so, the memory is not released. It balloons up to 2.5 GB, and when the process completes, it stays in use. In Unity, it never rose anywhere near as high, and went right back down once the process completed. So I agree with MoonWizard that it may be a stack overflow somewhere.

    It might be possible to chunk it. The easiest way I can think of is to load all the data into memory without creating the nodes, storing it in a global variable somewhere. Then, in a processing window, have a button that says "Process the next 100 records" or something, perhaps with a countdown of the number of lines left to process. This would require the user to click it multiple times, but that's better than having it crash or hang.
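    A minimal sketch of that chunking idea in plain Lua (the names `tPendingRecords`, `processRecord`, and `onProcessButtonPressed` are illustrative, not part of the FG API):

    ```lua
    -- Sketch: load everything up front, then create nodes in fixed-size
    -- batches, each triggered by a button press in the processing window.
    tPendingRecords = {}       -- global buffer of unprocessed records
    local BATCH_SIZE = 100

    -- Called once: split the raw import text into records without
    -- creating any database nodes yet.
    function loadRecords(sText)
        for sLine in string.gmatch(sText, "[^\r\n]+") do
            table.insert(tPendingRecords, sLine)
        end
    end

    -- Wired to a "Process the next 100 records" button (hypothetical handler).
    function onProcessButtonPressed()
        local nCount = math.min(BATCH_SIZE, #tPendingRecords)
        for i = 1, nCount do
            local sRecord = table.remove(tPendingRecords, 1)
            processRecord(sRecord)  -- your existing per-record node creation
        end
        -- Countdown for the user: how many lines remain.
        print(#tPendingRecords .. " records left to process")
    end
    ```

    Note that `table.remove(t, 1)` shifts the whole table each call; for very large imports it would be cheaper to keep a running index into the table instead.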

  2. #12
    You mention the memory isn't being released. I don't know exactly how FG sandboxes its Lua calls, but is it possible to manually trigger garbage collection (with the aptly named collectgarbage() function) after processing X number of entities? It might create a temporary CPU spike, but if the issue is memory usage, it might be worth the extra cycles. It could require some tweaking, such as setting values to nil before invoking the GC. I don't know specifically how Lua's GC model works, but it might not collect resources until your while loop finishes, or worse, until the function completes and the local variables are no longer referenced, so forcing it to run partway through could be an option.
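    The idea above can be sketched in standard Lua; `collectgarbage("collect")` and `collectgarbage("count")` are part of the base library, while `tEntities` and `processEntity` are illustrative placeholders for the importer's own data and per-record work:

    ```lua
    -- Sketch: force a full GC cycle every N entities instead of waiting
    -- for the loop (or the enclosing function) to finish.
    local GC_INTERVAL = 100
    local nProcessed = 0

    for _, tEntity in ipairs(tEntities) do   -- tEntities: your parsed records
        processEntity(tEntity)               -- illustrative per-entity work
        nProcessed = nProcessed + 1
        if nProcessed % GC_INTERVAL == 0 then
            -- Drop any scratch references (set them to nil) before this point
            -- so the collector can actually reclaim them.
            collectgarbage("collect")        -- full cycle; costs some CPU
            -- collectgarbage("count") reports current Lua heap usage in KB.
            print(string.format("memory in use: %.1f KB", collectgarbage("count")))
        end
    end
    ```

    Whether this helps inside FG's sandbox depends on whether the host exposes `collectgarbage` to extension scripts at all, which I haven't verified.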

  3. #13
    celestian's Avatar
    Join Date
    Jan 2017
    Location
    DFW, Texas
    Posts
    2,191
    Quote Originally Posted by celestian
    I've tried various things to resolve this issue under FGC ... even adding a sleep after X processes. I think the only way to actually "fix" it is to process X amount, store the rest of the text in a buffer, prompt the user for "next" batch and then restart.

    The only reason I've not tried the last method is... FGU doesn't give me this problem so... I've not tested but I'm pretty sure it would work ... assuming I properly understand the issues causing it.
    So this turned out to be pretty easy to set up with my current import process, and it does seem to work. In short, I count each line I process; when I reach XX lines, I start storing the remaining entries instead of processing them. After the loop (stepping through each line), I check whether "aImportRemaining" is > 0; if so, I place the remaining "text" to process (which no longer includes the entries we already handled) into the text field in the import window and prompt the user to press IMPORT again for the next batch.
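    Roughly, the batching logic described above looks like this in plain Lua. `aImportRemaining` is the name from the post; `BATCH_LIMIT`, `processLine`, `setImportFieldText`, and `promptUser` are hypothetical stand-ins for the importer's own code and window controls:

    ```lua
    local BATCH_LIMIT = 50          -- "XX": tune per ruleset/import size
    local aImportRemaining = {}     -- lines deferred to the next batch

    function processImportText(sText)
        local nProcessed = 0
        aImportRemaining = {}
        for sLine in string.gmatch(sText, "[^\r\n]+") do
            if nProcessed < BATCH_LIMIT then
                processLine(sLine)                     -- existing per-line import
                nProcessed = nProcessed + 1
            else
                table.insert(aImportRemaining, sLine)  -- defer the rest
            end
        end
        if #aImportRemaining > 0 then
            -- Put the unprocessed text back into the import window and
            -- prompt the user to press IMPORT again for the next batch.
            local sRemaining = table.concat(aImportRemaining, "\n")
            setImportFieldText(sRemaining)             -- hypothetical window call
            promptUser("Press IMPORT again for the next batch")
        end
    end
    ```

    Because each IMPORT press starts a fresh script invocation, any memory ballooned by the previous batch has a chance to be reclaimed between presses, which is the whole point of the workaround.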

    It's a bit tedious but it gets around the whole memory issue thing.

    This would be better handled by an external Python/Perl script to generate the data, given the size of the import block, but normal people aren't going to have that lying around. This approach will be the simplest for most novice users; it'll just take a while to finish.

    The code I used to do this will be available in a repo once the project is ready for public view (expect this weekend).
