[Code] A Thread On Lambdas And Other Epsilons


simplex


@debugman18

I made a fresh clone of the github U&A repo, tried running make and... almost everything was rebuilt, not just the Spriter projects. The issue is that git sets the modification time of pulled files to the time of the pull (precisely for compatibility with tools like make, so that pulling a new source file guarantees a rebuild).
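A quick way to see this behaviour in isolation (a throwaway stand-in repo, not the actual U&A one; assumes GNU coreutils):

```shell
#!/bin/sh
# Sketch: git gives checked-out files the checkout time as their mtime,
# so a fresh clone looks "newer" than any previously built byproducts.
set -e
tmp=$(mktemp -d)
git init -q "$tmp/src"
git -C "$tmp/src" config user.email you@example.com
git -C "$tmp/src" config user.name you
echo 'source' > "$tmp/src/a.txt"
git -C "$tmp/src" add a.txt
git -C "$tmp/src" commit -qm init
touch -d '2000-01-01' "$tmp/src/a.txt"   # make the original look old (GNU touch)
git clone -q "$tmp/src" "$tmp/clone"
# The clone's copy carries today's mtime, not the 2000 one:
stat -c '%y %n' "$tmp/src/a.txt" "$tmp/clone/a.txt"
```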

Ideally, byproducts of a build system should not be placed in a git repository, both to keep it smaller and to avoid issues like this. Right now, whenever anyone pulls a byproduct together with the sources, it will likely be rebuilt when they run make. If we had more contributors dealing directly with the git repo, we'd be constantly committing unnecessary byproducts, such as anim zips and texs. Also, any fresh clone of the repo will rebuild almost everything, so any commit from it will include a huge amount of superfluous stuff.

I'd prefer not to have anim zips or texs in the git repo, with all of them gitignore'd and built locally. I just never advocated for that because it'd require any Windows contributors using git to have Cygwin in order to build the mod before running it (and Mac contributors to have Xcode), and this includes testers willing to run the mod directly from the master branch (this does not affect packaged releases, of course). So I'm not sure how to handle this...
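For reference, the blanket-ignore approach would amount to something like this in .gitignore (the patterns are illustrative guesses at the layout, not the repo's real paths):

```
# Hypothetical .gitignore entries keeping built assets out of the repo.
# The actual asset paths in the U&A repo may differ.
anim/*.zip
*.tex
```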

Big open source projects (such as Firefox) usually release "nightly" builds, which are packaged releases from the development repository done on a daily basis, and this is what primary testers use. But I'm not so sure this approach would be a good fit for U&A (besides requiring constant effort on our part in uploading these packages).

EDIT: An alternative is, of course, to stop using a blanket "make" taking care of everything and only build things individually, when you know the sources have changed. But I'd rather steer clear of that, to avoid forgetting to compile some files after a source change, as well as passing suboptimal parameters to the compilation (if doing it completely manually).



 

Hmm. This is tricky to deal with. On one hand, having everything in the repo, as you described, certainly bloats it. On the other hand, the convenience is very nice. Although, having each person (mostly you and I) maintain their own compiled files means that if there is an issue with a particular animation, it's much harder for one or the other to notice, as evidenced by the numerous times that we've had differing results from images and animation compilations.

 

As for nightly builds, I agree with you; it doesn't really fit U&A. When it comes to mods, it's preferable for a user not to have to redownload them frequently, even when testing. Having everything on the repo allows testers to stay up to date, similarly to nightly builds, except that they don't have to redownload the entire mod every time we change something; they just have to pull. On the Steam Workshop (provided we can solve that problem) updating isn't an issue. On the forums, however, this is a big roadblock to nightly updates.

 

We could, alternatively, have a 'raw' repo, for such testing and convenience, but that would require more effort on our part, and that solution doesn't seem ideal.


Although, having each person (mostly you and I) maintain their own compiled files means that if there is an issue with a particular animation, it's much harder for one or the other to notice, as evidenced by the numerous times that we've had differing results from images and animation compilations.

As long as we both keep the tools (mod tools and ktech) up to date, this shouldn't matter (but of course that's easy for me to say, since I'm the one pushing the changes to both, so I'll always have them up to date :razz:). The only case where a difference in results was seen with both of us having the same tools was when the mipmaps were being generated poorly, and in that case it was the screen resolution that mattered, not the images/anims (which were the same).

As for nightly builds, I agree with you; it doesn't really fit U&A. When it comes to mods, it's preferable for a user not to have to redownload them frequently, even when testing. Having everything on the repo allows testers to stay up to date, similarly to nightly builds, except that they don't have to redownload the entire mod every time we change something; they just have to pull. On the Steam Workshop (provided we can solve that problem) updating isn't an issue. On the forums, however, this is a big roadblock to nightly updates.

Well, most testers would use the actually released versions, not a git clone or a nightly build (assuming a scenario where there'd be such a thing). But I'd hate to make life harder for those few testers willing to work with the latest of the latest.

We could, alternatively, have a 'raw' repo, for such testing and convenience, but that would require more effort on our part, and that solution doesn't seem ideal.

We'd have to constantly merge into that repo. And we'd have the asset rebuilding issue there as well.

But, well, I don't know. From a purely technical point of view, not putting built assets in the repo would be best. But it'd certainly hinder testing by the average user. Maybe we could use a git hook to mark all files received in a git pull as up to date?


@debugman18

Ok, here's the solution I came up with. I set up a git post-merge hook (run whenever a "git merge" is done, which is also done implicitly by a "git pull") that updates the timestamp of any zip or tex received (i.e., the zips and texs changed between the previous HEAD and the current HEAD).

To use it, put the post-merge script (inside the attached post-merge.zip) inside UpAndAway/.git/hooks, so that its path is UpAndAway/.git/hooks/post-merge (do not add any file extension, even though it's a bash script). And make sure it's executable, 'chmod +x'-ing if necessary (it shouldn't be, since zips preserve file attributes, but well :razz:). Then this script will be automatically run by git after a merge.
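The steps above, spelled out as shell commands (sketch only: a stub file stands in for the real script inside post-merge.zip, so the snippet is self-contained):

```shell
#!/bin/sh
set -e
# Stand-in repo directory; in reality this is your UpAndAway checkout.
repo=$(mktemp -d)/UpAndAway
mkdir -p "$repo/.git/hooks"
# Real installation would be: unzip post-merge.zip -d "$repo/.git/hooks"
# Here a stub represents the extracted script (note: no file extension).
printf '#!/bin/bash\n' > "$repo/.git/hooks/post-merge"
chmod +x "$repo/.git/hooks/post-merge"   # only needed if the bit didn't survive
ls -l "$repo/.git/hooks/post-merge"
```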

EDIT: updated the script to ensure it won't create any new files.

post-merge.zip


@simplex

I received this error during compilation:

 

Scanning dependencies of target ktool_common
[  4%] Building CXX object CMakeFiles/ktool_common.dir/src/common/ktools_common.cpp.o
In file included from /Users/debug/ktools/src/common/ktools_common.cpp:1:
/Users/debug/ktools/src/common/ktools_common.hpp:157:28: error: default
      initialization of an object of const type 'const class Nil' requires a
      user-provided default constructor
        static const class Nil {} nil;
                                  ^
1 error generated.
make[2]: *** [CMakeFiles/ktool_common.dir/src/common/ktools_common.cpp.o] Error 1
make[1]: *** [CMakeFiles/ktool_common.dir/all] Error 2
make: *** [all] Error 2



God damn it, it certainly does not require a user-provided default constructor :razz:. Not even Visual Studio, which is usually quite bad at auto-generating stuff, complained about it.

Which compiler are you using? (Running "make VERBOSE=1" will show the compilation commands.) If the compiler's name is just "c++" (/bin/c++ or similar), then run "c++ --version". I'm guessing it's clang.

Also, if instead of running ./configure you do

$ cmake . -DCMAKE_CXX_COMPILER=/bin/g++
does it work?

EDIT: I just pushed a "fix" for this, so the issue should be gone either way.



That did the trick. And yes, it was clang.


That did the trick. And yes, it was clang.

One would think the compiler would figure out how to generate a default constructor for an empty class, since there's nothing to be constructed :razz:. I'm pretty sure the standards-compliant behaviour would've been to do so; I don't get why clang didn't.

So, that was it? Did it compile and run cleanly on Mac after this fix?


One would think the compiler would figure out how to generate a default constructor for an empty class, since there's nothing to be constructed :razz:. I'm pretty sure the standards-compliant behaviour would've been to do so; I don't get why clang didn't.

So, that was it? Did it compile and run cleanly on Mac after this fix?

 

It did, yes.


I just updated the README. If you have any suggestions/complaints, please come forward :razz:.

And I also managed to compile a binary Windows release of ktech and krane with all dependencies bundled:

ktools-4.0-win32.zip

 

No complaints or anything, it checks out fine. :p

 

I tested converting an animation, and that worked fine as well. If I stumble across anything odd I'll be sure to let you know.


I meant under Mac, as in when I compiled it earlier. Switching to Windows is a bit of a pain. :razz: I could probably test it later though.

Great! I already tested them under Windows (both under Windows itself and under Linux, using WINE), so the Mac test is much more valuable information. Thanks!


@debugman18

I recompiled all anims with the latest mod tools. Let me know if anything is wrong.

I also made Make itself track when an animation needs to be recompiled, for uniformity in how our asset building is done and because the mod tools are a bit slow at that. To that end, I removed all spaces from the image names used in Spriter projects (and tweaked the top of the scml files to accommodate the change). We shouldn't let spaces creep in from now on :razz: (if that ends up being too much of a hassle, I'll revert the system to use the mod tools to check when an anim is up to date).
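For anyone needing to do the same rename in bulk, something along these lines strips the spaces (a sketch with made-up file names; the .scml references still have to be patched to match, as noted above):

```shell
#!/bin/bash
set -e
# Demo directory with space-ridden names standing in for Spriter images.
dir=$(mktemp -d)
touch "$dir/red mushroom.png" "$dir/blue hat.png"
# Rename every file containing a space, deleting the spaces outright.
find "$dir" -type f -name '* *' | while IFS= read -r f; do
    mv "$f" "${f// /}"
done
ls "$dir"   # bluehat.png  redmushroom.png
```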


@debugman18

I updated my post-merge git hook to first update the texs, and then the zips. This is to prevent the things in anim/ from being needlessly updated. I also added a small delay between each step. If you'd like to do the same, this is my current post-merge hook:

#!/bin/bash

sleep 0.05

echo "Updating timestamps of ZIPs and TEXs..."

IFS=$'\n'
for file in $(git diff-tree -r --name-only --no-commit-id HEAD@{1} HEAD); do
	if [[ "$file" == *.tex ]]; then
		touch -c "$file"
	fi
done

sleep 0.05

IFS=$'\n'
for file in $(git diff-tree -r --name-only --no-commit-id HEAD@{1} HEAD); do
	if [[ "$file" == *.zip ]]; then
		touch -c "$file"
	fi
done

Archived

This topic is now archived and is closed to further replies.

Please be aware that the content of this thread may be outdated and no longer applicable.