Whyyyyy, WHY did I say on discord that the next alpha update would be ‘soon’? It angered the demo-gods and we’ve been dealing with an asset-loading problem ever since.
That issue has now been fixed and I feel like we know when that update will be landing. However, I am not even hinting at when that could be, given the wrath it might incur.
We have also been moving our assets out of Unity Collab. We used Collab for a while, but there are aspects of it that make it painful for teams, and there has been no sign that it will improve. For now we are using Git LFS, which is what we have been using for the game itself for over a year. Once set up, Git LFS has been pretty good to us so far.
The money from the Kickstarter has landed and @Ree has been busy working on the business side of things. We are looking forward both to being able to say we are officially hired by Bouncyrock and to giving you more little bits of info regarding what happens to the money after you make your pledge.
I (Baggers) have just got back from a long weekend with family in the west of Norway. The last year and a half has been a lot of work so it’s really nice to be seeing folks again.
More news as it happens!
It’s been a nice day today, both @Ree and I got some coding done. Here are a few highlights:
The dice feel regressed a while back, and Ree got that fixed up again.
He also updated the implementation of the radial menus, which we hope has fixed the issue of some stats not showing. I tested this, but we have never been able to get a reliable reproduction of the issue, so we will see how this fix holds up in the wild.
I found an interesting default behavior in the networking library we use which was resulting in huge hangs when updating certain player information. Once found, the fix was trivial.
We have enabled Unity’s incremental garbage collector to smooth over another GC spike we were seeing. It’s very much a band-aid but it’s a nice one for the alpha before we get to addressing the core issues behind the allocations.
More news and an update to the alpha in the near future.
Day two of the dev planning, and again things went well. The main job today was going back through the task lists and working out which tasks depend on which. This is important so we don't get blocked by each other and so that, in cases where there is a dependency, there will be work the blocked person can be doing.
We won't be publishing these orderings, as they will change and we'd like to keep things flexible.
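Working out a valid ordering from a dependency list is a classic topological sort. As a minimal sketch (the task names here are hypothetical, not our actual task list), Python's standard library can do it directly:

```python
from graphlib import TopologicalSorter

# Hypothetical tasks; each maps to the set of tasks it depends on.
deps = {
    "undo-redo": {"data-layout"},
    "batching": {"data-layout"},
    "fog-of-war": {"batching"},
    "data-layout": set(),
}

# static_order yields every task with its dependencies always earlier.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

The useful property for planning is exactly the one mentioned above: any task appears only after everything it depends on, so nobody picks up work that is still blocked.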
Following on from yesterday's work, I also got the backend running fully without an internet connection, so I'm hoping that with a little more work I can point Unity at my laptop and run the whole TaleSpire stack locally. This will be a big win for iteration time when developing the backend.
Lastly, and most prettily, @Dwarf has made an awesome-looking kobold which we will be seeing in in-game shots in the near future.
That’s all for today,
The end of last week saw me playing with a cool toy so I thought I’d write about it.
First though, I’ve carried on with my digging into various erlang bits and bobs.
With a bit of prodding I was able to get epmdless working with my erlang/docker setup. This is great as distributed erlang is one of the powerful features that many erlang libraries lean on and previously this was hampered by docker requiring explicit port mappings. If you are interested in epmdless in conjunction with docker you can check out this article here.
I also spent some quality time getting to know gproc, erlbus, and websockets a little better, and I'm much happier now with where I'd use them in my projects. I haven't got much more concrete to say on these, but having a good understanding of common tools is helping with the design of the next iteration of the backend (more news on this in the future too).
Lastly, a fun bit (for me). For a time I really enjoyed being able to use Docker to test the TaleSpire backend locally. I had web servers, DB servers, etc. on their own little network, and I could get a reasonable idea of how things should run. However, at one point we started using Amazon's S3 for storage, and that put a spanner in the works as I couldn't test it locally. I could make an alternative approach for local dev, but that's more code to maintain and it could fail in ways that confuse my testing. Luckily there is a project called minio which runs in a Docker container and presents the same API as S3. I've already modified the image to include my preferred tweaks and have tested making presigned urls, which work GREAT.
So now I get to go remove some ‘local dev’ hacks and I get a simpler, more realistic, fully local, dev environment. LOVE IT.
Alright, that’s all the fun nerdage for now, back to planning :)
One thing I got stuck on initially with epmdless was that I was using rebar3 shell in the entrypoint script for my container. This will not work with epmdless, as the epmd module VM args would need to be passed to the shell, which would then freak out as it wouldn't know where to find the epmdless module. Instead, try having the entrypoint script make a debug release of your Erlang app and then start that in foreground mode. Then your config/vm.args will be used and everything will work fine.
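For reference, a sketch of what the relevant config/vm.args entries might look like, assembled from the same flags used in the remote-shell settings later in this post (the node name, ports, and paths are placeholders, so adjust to your setup):

```
-name myapp@127.0.0.1
-setcookie cookie
-proto_dist epmdless_proto
-epmd_module epmdless_client
-env EPMDLESS_DIST_PORT 18999
-env EPMDLESS_REMSH_PORT 18000
-pa /app/_build/default/lib/epmdless/ebin/
```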
Also, remember to expose the remsh port for epmdless so you can use the remote shell. I used this for my Erlang repl settings in Emacs. Once you have the official examples running, most of this should be fairly self-explanatory.
(setq inferior-erlang-machine-options '("-env" "EPMDLESS_DIST_PORT" "18999" "-env" "EPMDLESS_REMSH_PORT" "18000" "-name" "email@example.com" "-setcookie" "cookie" "-remsh" "firstname.lastname@example.org" "-proto_dist" "epmdless_proto" "-epmd_module" "epmdless_client" "-pa" "/app/_build/default/lib/epmdless/ebin/"))
The end of last week looked a lot like this:
This was because I was sketching out ideas for the undo/redo system and I wanted a nice simple model for doing so. This is a 4x4 map where tiles are represented by letters.
These tests allowed me to find small mistakes in my assumptions and to fix them in a much simpler (and less distracting) environment than doing the work in Unity. It made it trivial to test out how I want to resolve the conflict between needing to apply changes immediately and needing to apply changes in the same order on all clients to get the right result.
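To give a flavor of the kind of model I mean (this is an illustrative sketch, not our actual code, and I was working in lisp rather than Python): a 4x4 map where tiles are letters, and every client re-applies the full op list in one canonical order, so everyone converges even though each client applied its own op immediately.

```python
EMPTY = "."

def apply_ops(ops):
    """Rebuild the 4x4 map by applying ops in canonical order."""
    grid = [[EMPTY] * 4 for _ in range(4)]
    # Canonical order: sort by (time, client_id) so ties break the
    # same way on every client.
    for time, client, x, y, tile in sorted(ops):
        grid[y][x] = tile  # tile is a letter, or EMPTY for a delete
    return ["".join(row) for row in grid]

# Two clients write the same cell concurrently; the result is the
# same no matter which order the ops arrived in.
ops_a = [(1, "A", 0, 0, "a"), (1, "B", 0, 0, "b")]
ops_b = list(reversed(ops_a))
assert apply_ops(ops_a) == apply_ops(ops_b)
print(apply_ops(ops_a)[0])  # prints b... (client B wins the tie deterministically)
```

Rebuilding from scratch on every op is obviously too slow for real use; the point of the toy is only to check that the ordering rule gives every client the same final map.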
I was using lisp for this as it’s where I’m most comfortable and also because the ease of recompiling smaller chunks of code just makes for nice fast iteration times.
Since then I've been looking at fog of war, but that has mainly been prep for the planning sessions that will take place when Ree is well. So there's nothing to show there, as it mainly amounts to lots of pages of my notebook being covered in scribbles :)
For now, that’s the update.
Keep an eye out for more news coming soon!
Heya folks, behind the scenes I’ve been working on the next version of the undo/redo system along with new data layouts to make batching (for rendering) faster.
Undo/redo is one of the areas where we've had serious lag before when working with large slabs of tiles. When looking at the code it was understandable why; however, there are interesting complexities to the system when you have multiple people adding and deleting tiles concurrently.
For the last two days, I've been working on a new design for the system. While doing this, we also have to make sure that the data layout we choose doesn't make frequent tasks like rendering slow.
This is one of the bits I love with this job, sitting down with a pen and paper and just wrestling with a design.
I think I have something now but I’m not going to put it here yet as it needs some further validation. What I’ll do next is code a small prototype so I can see the properties of the system more clearly.
More details on that in the coming days :)
Research continues and, for now, that isn’t giving me much to show so I took an hour out to explore a requested feature: NDI Support.
NDI is a standard that lets applications deliver video streams via a local area network. We are interested in being able to take these video feeds and use them in TaleSpire. This originally came up as a request from the community, as it is apparently a popular way for streamers to integrate video (such as their players' Skype chat) into their streams.
It sounded cool, and luckily there is already a Unity package available for working with NDI. It worked like a charm, so HUGE props to keijiro for that great work.
This was a very limited test, but it is encouraging. We would need to do a bunch more work to be comfortable shipping this, and we wouldn't want to mess with the roadmap we have promised the Kickstarter backers. All that said, this was a fun test and one I'm excited to revisit if it makes sense later on.
That’s all for now
Another short dev log while we are still in the Kickstarter campaign.
Recently I've carried on studying. It looks like the hybrid renderer doesn't handle per-instance data yet (which is mad), so we won't be able to use it for now. This means we will probably want to avoid Unity's ECS and instead work with something similar but custom until that stack matures. This is no real issue: although we will have to do a little tooling work when we need insight, we do have experience in that area.
Ree has always been the lead dev when working with graphics and shaders in Unity, as he has many more years of experience there than me. I (Baggers) come with a decent amount from the GL side, though, so I've spent some time getting more used to HLSL and the tooling we've been using. It's fun. A lot of things are set up for you, of course, so I'm not sure (without the scriptable rendering pipeline) how one really packs data or does very custom stuff, but there's plenty of time to learn that, and the team has bags of experience I can lean on. This is really just making sure we are cross-skilled enough that we don't bottleneck each other too much (although when it comes to content creation I'm hopeless :D)
That’s all for today,
A quick update for today.
The day was mostly spent reading up on the new Unity.Physics package to see if it will be viable for use in TaleSpire. It is a preview package, and that does come with a level of risk; however, we only use the current physics system for dice and for casting rays against for tools. This means our requirements are very simple, and even in its current state the package has what we would need, assuming of course that it works as documented.
It certainly has some surprising aspects such as how it’s not framerate independent by default, but handling that seems to be easy enough.
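Handling that usually means the classic fixed-timestep accumulator: carry over leftover frame time and run whole physics steps from it. A minimal language-neutral sketch (the names here are ours, not the Unity.Physics API):

```python
# Physics always advances in fixed increments, regardless of framerate.
FIXED_DT = 1 / 60

def pump(frame_dt, accumulator, step):
    """Consume a variable frame delta, running whole fixed steps."""
    accumulator += frame_dt
    steps = 0
    while accumulator >= FIXED_DT:
        step(FIXED_DT)       # simulate exactly one fixed step
        accumulator -= FIXED_DT
        steps += 1
    return accumulator, steps  # leftover time carries into next frame

# A 35 ms frame at 60 Hz physics yields 2 fixed steps plus a remainder.
acc, n = pump(0.035, 0.0, lambda dt: None)
print(n)  # prints 2
```

The remainder is carried into the next frame, so a fast or slow framerate changes only how many steps run per frame, never the size of each step.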
Heya folks, yesterday and today I have been getting familiar with Unity's Job System and new ECS.
I've written toy ECSs before and am currently working on a little optimizing compiler for querying table data, so a lot of things were very familiar. The bulk of the video and prose content for Unity's new systems is focused on the fact that the new thing is 'different, but don't worry, it's fast and not that hard', etc. As I'm already sold on the premise, this is nice but of limited use here. Beyond that, you'd better be comfortable reading other people's code :p
The best resource so far has been the ECS Samples. It's a small set of samples evolving a piece of code from a foreach on the main thread through different flavors of jobified tasks. One of the biggest advantages of the code is just seeing what is still in use. If you only glanced at the cheatsheet, you'd miss certain things (like sync points).
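The shape of that progression is easy to sketch outside Unity (the real API is C#, e.g. IJob / IJobParallelFor; this Python is only a language-neutral illustration of the same idea): the same transform done first as a plain main-thread loop, then split into chunks handed to worker "jobs".

```python
from concurrent.futures import ThreadPoolExecutor

positions = list(range(8))

# 1. The starting point: a plain foreach on the main thread.
moved = [p + 1 for p in positions]

# 2. Jobified: each chunk is an independent job over a slice of the data,
#    so the work can be spread across worker threads.
def job(chunk):
    return [p + 1 for p in chunk]

chunks = [positions[i:i + 4] for i in range(0, len(positions), 4)]
with ThreadPoolExecutor() as pool:
    results = [x for part in pool.map(job, chunks) for x in part]

assert results == moved  # same answer, different execution shape
```

The samples walk through exactly this kind of restructuring, plus the scheduling and sync-point details that the cheatsheet glosses over.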
The reference docs are passable, but there are plenty of places where the description of a method is just its name. When I hit walls in C# I often read Mono's source to get an idea of what to expect, but naturally many of the things in the new ECS are either proprietary or in C++ code I can't reach easily, so those missing doc-strings become a pain point.
Given that, in TaleSpire, we load most assets dynamically, I'd like to avoid having to use the Entity conversion routines on load of each asset. I'm fine with running them in TaleSpire and serializing something useful, but I'm not sure how prefabs play with ECS entities at this stage. I'd be very surprised if Unity's mega-city demo was using conversion routines, so I'm sure there is still more to understand here. Also, the lack of documentation around the Hybrid.Renderer is a pain in the proverbials. I'd really like to look into zone culling in TaleSpire to potentially improve rendering performance, but information seems scarce here. I'm sure it's all in the mega-city demo, but that doesn't feel like the best introductory material to be using.
Regardless, between the samples and these docs, I’m making decent enough progress. We’ll definitely be using this in TaleSpire and I’ll have this under my belt well before the campaign ends.