Is there a way to obtain live logs in the local testing environment? Currently, the logs are dumped after execution completes; it would be nice to have them streamed in real time instead. I understand that in “/runtime/entrypoint.sh” one could add --live-stream to the line “conda run -n py python main.py” (even though this won’t be possible for the actual submission), but I think the Docker image we pulled won’t reflect this change. Could the repo’s Dockerfile instead be built with --live-stream appended to the conda run line in entrypoint.sh?
Any suggestions would be greatly appreciated, thanks!
This sounds like a timing issue with when Docker flushes output to the console running the container, but note that the Python process inside the container pipes its STDOUT and STDERR to log.txt in the submission folder.
You should be able to run tail -f submission/log.txt and watch lines appear as they are written out.
This looks like it’ll work; I’m just wondering whether it is any different from watching log.txt in the submission folder, which should already behave the same way?
I think loguru is logging to stdout, and only at the end of execution is that output copied to log.txt, whereas in the approach above logging goes directly to the file. There should be a way to do something similar even with loguru. The loguru equivalent for this may be loguru._logger.Logger.add, i.e. logger.add().
What worked for me was adding the --no-capture-output flag to the conda run statement for main.py in entrypoint.sh in the runtime folder, and then building a local Docker image with make test-container. Sure, we won’t be able to do this on the actual submission website, but for local testing purposes it works well. The issue is that conda run captures the stdout and stderr of main.py (and everything it calls) instead of streaming them to the terminal; adding --no-capture-output disables that capture so the output appears live.
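For reference, a sketch of the change, assuming the entrypoint.sh line quoted earlier in the thread (adjust the environment name and script path to your repo):

```shell
# runtime/entrypoint.sh (local testing only, not for real submissions):
# --no-capture-output makes conda run stream the child's stdout/stderr
# to the terminal instead of capturing it.
conda run --no-capture-output -n py python main.py
```

Then rebuild the local image (make test-container in this repo) so the container actually picks up the edited entrypoint.sh.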