Long story short, the CPUC decided not to give data to some users rather than to adopt a privacy standard that would have allowed those users to get useful data.
It's a long post, well worth reading, about what went wrong and what could have been done better. In the second part of his post, called On practicing differential privacy, he reflects on how he would approach such a project in the future. I'll just summarize some of his subject headings:
Focus on win-win applications
"Apply differential privacy as a tool to provide access to data where currently access is problematic due to privacy regulations. Don’t fight the data analyst. Don’t play the moral police. Imagine you are the analyst."
Don’t empower the naysayers
"[F]or differential privacy to be a major success in practice it would be sufficient if it were successful in some applications but certainly not in all—not even in most."
Change your narrative
"Don’t present differential privacy as a fear inducing crypto hammer designed to obfuscate data access. That’s not what it is. Differential privacy is a rigorous way of doing machine learning, not a way of preventing machine learning from being done."
Build reliable code repositories
"A weakness of the differential privacy community has been the scarcity of available high quality code."
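To make that point concrete, here is a minimal sketch (my own illustration, not code from the post) of the kind of building block such repositories would package: the Laplace mechanism applied to a counting query, which satisfies epsilon-differential privacy because a count has sensitivity 1.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from a Laplace(0, scale) distribution
    via inverse transform sampling."""
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def dp_count(values, predicate, epsilon: float) -> float:
    """Release a count with epsilon-differential privacy.

    Adding or removing one record changes a count by at most 1
    (sensitivity 1), so Laplace noise with scale 1/epsilon suffices.
    """
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)
```

For a realistic release one would also track the privacy budget across repeated queries; this sketch shows only the single-query mechanism.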
Be less general and more domain-specific
"... reading the scientific literature on differential privacy from the point of view of a domain expert can be very frustrating. Most papers start with toy examples that make perfect sense on a theoretical level, but will appear alarmingly naïve to a domain expert."
Be more entrepreneurial
"The CPUC case highlighted that the application of differential privacy in practice can fail as a result of many non-technical issues. These important issues are often not on the radar of academic researchers."
So, is differential privacy practical?
"I like the answer Aaron Roth gave when I asked him: It's within striking distance."