Final Submissions Due Sunday

You’re already aware of this, but just to be official about things: final submissions (with executable, source code, final write-up/proof, and code guide) are due to the Prescreened Leaderboard this Sunday, 11/15. It’s a lovely weekend for privacy-preserving data science!

Let us know if you have any last-minute questions or run into any difficulty submitting. I’ll check in on this thread periodically over the weekend.

To confirm, the deadline for submission is Sunday, 8 PM EST (New York time). Is that correct?
If our method differs from our previous submission, we need to submit both a new write-up and new executable code. We will try to make the new write-up clear, but if some parts are unclear, can we still explain them (via email) after the deadline? Is that correct?

Hi @cuongk14 - Thanks for confirming. The deadline is indeed 8 PM EST on 11/15/20. Regardless of whether your approach has changed from your pre-screening submission, your final scoring submission should include everything outlined in the Final Scoring section of the Problem Description.

Reviewers may follow up for clarifications if needed, but this is not required and you should make your materials as clear and complete as possible as a self-contained submission.

Good luck!

Our solution requires an extra dependency. We made a pull request on GitHub yesterday; when can we expect it to be accepted so that we can make a valid submission to the Prescreened Arena?


We have a similar situation with our solution.
I believe the GitHub repository instructions were for local testing only and do not have anything to do with the final contest code submission.
However, the prescreened code .zip does not ask for the py-cpu.yml file, and it is unclear how our code will run in your environment.

@rmckenna The pull request has been accepted.

The Docker environment that your code runs in is identical to the one created from the master branch of the repo. That means if your code needs a Python package that is not in the .yml file, you will need to submit a pull request to get it added.

Let’s assume you need package “statsmodels” but it’s not already in the .yml file. Obviously you can just make changes locally and rebuild your local version of the Docker container to get things working on your machine. But when your code tries to import statsmodels in the actual evaluation environment that we are running, it will fail because that isn’t in our official image yet. To get it into our official image you would submit a PR like this one on GitHub.
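To make the example above concrete, a PR like that would typically add one line to the environment file. A minimal sketch, assuming the file is the py-cpu.yml mentioned above (the surrounding entries are placeholders, not the repo’s actual contents):

```yaml
# py-cpu.yml (sketch -- existing entries are placeholders)
name: py-cpu
dependencies:
  - python=3.8
  - numpy
  - statsmodels   # new dependency added via the PR
```

Once the PR is merged and the official image is rebuilt, `import statsmodels` will work in the evaluation environment just as it does in your locally rebuilt container.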

Does that make sense?

Yup. That makes sense.
I have added a pull request to the repository.

@mshubhankar Hm, I don’t see any open pull requests.

The submission portal is currently closed. Will it reopen before 8 PM EST?

Where are we supposed to make our final submissions? The prescreened leaderboard says that it’s closed.

Hey all, apologies on that, we’ll get it sorted promptly!

@yuchaotao @zschutzman Check now - should be fixed!

Hi, we’re wondering how we should submit our proof of privacy. Would it be separate from the containerized source code, or would everything be submitted through the Prescreened Arena?

@joiewu Please check the instructions. Everything should be submitted through the Prescreened Arena.

And don’t forget to include a code guide along with your updated privacy proof (see the overview of final submission contents here: Preparing for the Nov 9th Sprint #1 Final Scoring Invitation )

It didn’t go through the last time. You should see it now.

Should the code guide be inside the privacy write-up, or should it be a separate file? Asking because the final submission instructions listed here state the former, but the instructions posted on the forum state the latter.

Either one is fine, as long as the required information is covered somewhere and we can find it easily.

@mshubhankar PR has been accepted. The updated environment should be live in a few minutes.
