Showing posts with label crowd sourcing. Show all posts

Sunday, April 17, 2016

Crowdsourcing City Government, by Glaeser, Hillis, Kominers, and Luca


Crowdsourcing City Government: Using Tournaments to Improve Inspection Accuracy
By Edward L. Glaeser, Andrew Hillis, Scott Duke Kominers, and Michael Luca

The Papers and Proceedings version doesn't have an abstract, but here's one from the NBER working paper:

Can open tournaments improve the quality of city services? The proliferation of big data makes it possible to use predictive analytics to better target services like hygiene inspections, but city governments rarely have the in-house talent needed for developing prediction algorithms. Cities could hire consultants, but a cheaper alternative is to crowdsource competence by making data public and offering a reward for the best algorithm. This paper provides a simple model suggesting that open tournaments dominate consulting contracts when cities have a reasonable tolerance for risk and when there is enough labor with low opportunity costs of time. We also illustrate how tournaments can be successful, by reporting on a Boston-based restaurant hygiene prediction tournament that we helped coordinate. The Boston tournament yielded algorithms—at low cost—that proved reasonably accurate when tested “out-of-sample” on hygiene inspections occurring after the algorithms were submitted. We draw upon our experience in working with Boston to provide practical suggestions for governments and other organizations seeking to run prediction tournaments in the future.
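The out-of-sample test the abstract describes can be sketched in a few lines: entries are scored only on inspections that occurred after the submission deadline, so no one can fit the evaluation data. This is purely illustrative; the function, data, and error metric below are made up and are not from the Boston tournament.

```python
# Hypothetical sketch of "out-of-sample" tournament scoring: an entry is
# evaluated only on inspections dated after the submission deadline.
# All names, dates, and numbers here are illustrative.

def score_entry(predict, inspections, submission_date):
    """Mean absolute error of predicted violation counts, computed only
    on inspections that occurred after the submission deadline."""
    holdout = [r for r in inspections if r["date"] > submission_date]
    errors = [abs(predict(r["restaurant"]) - r["violations"]) for r in holdout]
    return sum(errors) / len(errors)

# Toy data: two inspections conducted after the (hypothetical) deadline.
inspections = [
    {"restaurant": "A", "date": "2015-06-01", "violations": 3},
    {"restaurant": "B", "date": "2015-06-15", "violations": 0},
]

# A trivial baseline entry that always predicts one violation.
baseline = lambda restaurant: 1

print(score_entry(baseline, inspections, "2015-05-01"))  # 1.5
```

ISO-format date strings compare correctly as plain strings, which keeps the temporal split to a one-line filter.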

Thursday, December 1, 2011

What does the NSF do? What should it do? Reports from and about the Social, Behavioral and Economic Sciences, and Dec 1 Webinar

What should the National Science Foundation division of Social, Behavioral and Economic Sciences be doing? They asked and we answered, and now they're having a webinar to report the results: here's the email announcement.

Dear Colleague:

Just a year ago, we stopped accepting SBE 2020 white papers.  The papers were released to the public in February and now we have completed a report, Rebuilding the Mosaic, which briefly describes the process, some of the themes we identified, and the programmatic implications of what we learned.  The report is available at: http://www.nsf.gov/sbe/sbe_2020/index.cfm, and we expect to host a webinar/town hall on December 1.  The login details are below.

All of your papers contributed to our thinking about the future of research in the SBE sciences, and we continue to be amazed at and grateful for your participation.  I hope that you will take a moment to read the report – all of the papers are listed in Appendix 5.  For the foreseeable future, we also expect to maintain the website (http://www.nsf.gov/sbe/sbe_2020/index.cfm), where the papers can be individually found and downloaded, since the report cannot substitute for the many ideas that you have shared with us and with the American people.

Although I have written to you before to express my appreciation, one more time, let me say:

Thank you.

Myron Gutmann

Directorate for the Social, Behavioral, and Economic Sciences
National Science Foundation
Details for participating in the webcast:



Date: December 1 at 11 a.m.
Webcast Title: Rebuilding the Mosaic: Listening to the Future in the SBE Sciences

Dial-in phone number:  888-469-1936
Verbal Passcode: Mosaic


Webcast URL:  http://live.science360.gov/    (will be active on Dec. 1.)
Webcast username: webcast
Webcast password: mosaic (case sensitive)



*********
Earlier, in a statement to Congress, Dr. Gutmann highlighted some of the tangible benefits derived from market design work that the NSF has supported:



"3.1 SBE research has resulted in measurable gains for the U.S. taxpayer
Matching markets and kidney transplants. Researchers in economics at Harvard University, the University of Pittsburgh, and Boston College have applied economic matching theory to develop a system that dramatically improves the ability of doctors to find compatible kidneys for patients on transplant lists. Organ donation is an example of an exchange that relies on mutual convergence of need. In this case, a donor and a recipient. This system allows matches to take place in a string of exchanges, shortening the waiting time and, in the case of organ transplants, potentially saving thousands of lives.10 Similar matching markets exist in other contexts, for example, for assigning doctors to residencies.
Spectrum auctions. Spectrum auctions have generated $54 billion for the U.S. Treasury between 1994 and 2007 and worldwide revenues in excess of $200 billion. Researchers at Stanford University and the California Institute of Technology, supported by grants from SBE, developed the simultaneous ascending auction mechanism as a technique for auctioning off multiple goods whose values are not fixed but depend on each other. The mechanism was then tested experimentally and further refined before being implemented by the Federal Communications Commission. In this auction, all of the goods are on the selling block at the same time, and open for bids by any bidder. By giving bidders real-time information on the tentative price at each bid stage, bidders can develop a sense for where prices are likely to head and adjust their bids to get the package of goods they want. This process enables "price discovery," helping bidders to determine the values of all possible packages of goods. These auctions not only raise money, but ensure efficient allocation of spectra so that the winners of the auction are indeed the individuals who value the spectra the most. Applied with great benefit for the U.S. taxpayer in the FCC spectrum auctions, this method has also been extended to the sale of divisible goods in electricity, gas, and environmental markets.11"
*****************
Here's an earlier post on congressional testimony:

NSF Social, Behavioral and Economic Sciences--attack and defense

Saturday, February 12, 2011

Will reputation and crowd sourcing facilitate alternative forms of peer review?

That's the question raised in a (gated) article in the Chronicle of Higher Education about a proposal to publish papers online, and then have them subject to comment: 'Facebook of Science' Seeks to Reshape Peer Review

"Mr. Tracz plans to turn his latest Internet experiment, a large network of leading scientists called the Faculty of 1000, into what some call "the Facebook of science" and a force that will change the nature of peer review. His vision is to transform papers from one-shot events owned by publishers into evolving discussions among those researchers, authors, and readers.
...
"The core function of F1000 is to allow members to highlight any newly published paper that they consider interesting and give it a points rating of six (recommended), eight (must read), or 10 (exceptional). Many members give network access to a junior colleague who helps them rate publications.


"Members say in a sentence or two why they find the paper interesting. Readers then are able to attach their own comments to the F1000 site. (Authors can appeal comments they consider unreasonable.)
...
"For Mr. Tracz, this objective leads inevitably back to the more grandiose goal of upending the existing publishing system. "There are two big issues, for science and for publishing," he says. "One is peer review, and one is the publishing of data." While many researchers and publishers consider prepublication peer review to be, at worst, a necessary evil, Mr. Tracz is scathing about its weaknesses. "Except for a tiny little part at the top, where it is done seriously, peer review has become a joke. It is not done properly, it delays publication unnecessarily, it is open to abuse, and is being abused. It is seriously sick, and it has been for a while."

Tuesday, August 17, 2010

"Social" science

A novel kind of crowd-sourcing is described in: In a Video Game, Tackling the Complexities of Protein Folding

"Proteins are essentially biological nanomachines that carry out myriad functions in the body, and biologists have long sought to understand how the long chains of amino acids that make up each protein fold into their specific configurations.

"In May 2008, researchers at the University of Washington made a protein-folding video game called Foldit freely available via the Internet. The game, which was competitive and offered the puzzle-solving qualities of a game like Rubik’s Cube, quickly attracted a dedicated following of thousands of players.

"The success of the Foldit players, the researchers report in the current issue of Nature, shows that nonscientists can collaborate to develop new strategies and algorithms that are distinct from traditional software solutions to the challenge of protein folding.

"The researchers took pains to credit the volunteers who competed at Foldit in the last two years, listing “Foldit players” at the end of the report’s author list and noting that more than 57,000 players “contributed extensively through their feedback and gameplay.” "