Fri, Jun 18
Fri, Jun 4
Thanks @Htriedman for the deep dive. A few notes on our tooling, in case this helps:
Apr 30 2021
Side-note inspired by a remark from my colleague Mirac: given that the sensitivity is fixed to 1 in the code (which makes sense if using a pageview as the privacy unit), what is the meaning of setting it to a different value in the web UI?
I love the demo, congrats on your work on this!
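To make the sensitivity question concrete, here is a rough sketch (plain Python, not the actual code behind the demo) of a Laplace-noised count where "one pageview" is the privacy unit, so sensitivity is 1:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Sample from Laplace(0, scale) via the inverse-CDF method.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def dp_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    # With "one pageview" as the privacy unit, adding or removing one
    # unit changes the count by at most 1, hence sensitivity = 1.
    # A user-level privacy unit would instead need sensitivity equal to
    # the max number of pageviews a single user can contribute.
    return true_count + laplace_noise(sensitivity / epsilon)
```

Under that reading, setting sensitivity to a value other than 1 in the UI would only make sense if the privacy unit were something coarser than a single pageview.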
Jan 25 2021
Ah, I see. Yeah, no anonymization notion can change the fact that some data might be available without any guarantee because of some auxiliary release.
Jan 20 2021
@Isaac, that's an amazing demo! I love this, thanks for your work and thoughtful analysis =)
Nov 6 2020
might allow the release of data from years back, with the ability to do re-runs
Open question: would using a fingerprint user identifier based on user-agent and IP work as the privacy unit? I can't picture what the downside of such an approach would be.
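A minimal sketch of what I have in mind (the `fingerprint` helper is hypothetical, not from any existing tool):

```python
import hashlib

def fingerprint(ip: str, user_agent: str) -> str:
    # Hypothetical privacy-unit id derived from IP + user-agent.
    return hashlib.sha256(f"{ip}|{user_agent}".encode()).hexdigest()

# One possible downside: the same person gets a new fingerprint whenever
# their IP rotates (splitting their contributions across many "units"),
# while several people behind a shared NAT or proxy collapse into one
# unit, so the guarantee would not line up cleanly with individuals.
```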
It seems that the next steps are to do some tryouts exploring the idea of "a single view" as the privacy guarantee, and to try the end-to-end DP tool (thanks for releasing that!). I might not get to this for a month due to personal reasons, but it is on my radar.
Nov 5 2020
This is super helpful, thank you. I see there is a view_count field there. Do I understand correctly that this means the data was already aggregated? Say that field has value 5: does it mean that the page had 5 different views, potentially from 5 different users (but all with the same country, user_agent_map, etc.)? If so, what is the typical distribution of values for view_count?
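For what it's worth, if the rows really are pre-aggregated, that matters for DP: removing one pageview changes a row's view_count by 1, but removing a whole row changes the total by that row's view_count. A toy illustration with made-up rows (the data and field layout here are assumptions, not the real schema):

```python
from collections import Counter

# Made-up pre-aggregated rows: (country, user_agent_map, view_count).
rows = [
    ("FR", "desktop", 5),
    ("FR", "mobile", 1),
    ("DE", "desktop", 1),
    ("DE", "desktop", 2),
]

# Distribution of view_count values: how often a single row bundles
# several pageviews (and hence, potentially, several privacy units).
distribution = Counter(v for _, _, v in rows)

# If "one pageview" is the privacy unit, the sensitivity of the total
# is 1; if "one row" is the unit, it is max(view_count) instead.
total_views = sum(v for _, _, v in rows)
max_row_contrib = max(v for _, _, v in rows)
```

So the typical distribution of view_count would directly determine how much extra noise a row-level privacy unit would cost.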
Hi all, I'm one of the folks behind https://github.com/google/differential-privacy. We'd love to help you with your use case =)