Learning to Search More Deeply

Project Summary

Weiyao Wang (Mathematics) and Jennifer Du (Computer Science), along with NCCU Physics majors Jarrett Weathersby and Samuel Watson, spent ten weeks investigating how search engines often return results that are not representative in terms of race and/or gender. Working closely with entrepreneur Winston Henderson, their goal was to frame this problem using statistical and machine-learning methodology and to explore potential solutions.

Year
2016

Project Results

To understand Google's ranking algorithm, the team scraped search results and used machine learning to estimate the importance of each result feature. They then performed sentiment analysis to quantify public opinion on Twitter, and used community-based crawling and seeding to collect information relevant to minority groups.
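
As a concrete illustration of the feature-importance step, the sketch below (not the team's actual code; all feature names and data are invented placeholders) fits a random forest to synthetic search-result features and ranks them by learned importance:

```python
# Hedged sketch, not the team's actual code: estimate which scraped
# search-result features matter most to a ranking outcome.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Hypothetical features one might scrape per search result.
feature_names = ["page_rank_score", "num_inbound_links",
                 "query_term_frequency", "domain_age_years"]
X = rng.random((500, len(feature_names)))
# Synthetic "ranked highly" label driven by two of the features.
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(0, 0.2, 500) > 0.8).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
for name, importance in sorted(zip(feature_names, model.feature_importances_),
                               key=lambda pair: -pair[1]):
    print(f"{name}: {importance:.3f}")
```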

Download the Executive Summary (PDF)

Participants

  • Jennifer Du, Duke University Computer Science
  • Weiyao Wang, Duke University Computer Science, Mathematics, and Political Science
  • Jarrett Weathersby, North Carolina Central University Physics
  • Samuel Watson, North Carolina Central University Physics

Disciplines Involved

  • Sociology
  • Anthropology
  • Economics
  • All quantitative STEM

Related Projects

Social and environmental contexts are increasingly recognized as factors that affect patients' health outcomes. This team will have the opportunity to collaborate directly with clinicians and work with medical data in a real-world setting. They will examine the association between social determinants and predicted risk of hospital admission, and assess whether social determinants bias that risk in a systematic way. Applied methods will include machine learning, risk prediction, and bias assessment. This Data+ project is sponsored by the Forge, Duke's center for actionable data science.

Project Leads: Shelly Rusincovitch, Ricardo Henao, Azalea Kim

Project Manager: TBD
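
A minimal sketch of the kind of bias check described above, under assumed data (one synthetic social determinant and hypothetical variable names, not the project's data or code): fit an admission-risk model, then compare predicted and observed risk within each subgroup.

```python
# Hedged sketch: does an admission-risk model err systematically
# for one social-determinant subgroup? All data here are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000

age = rng.normal(60, 12, n)               # clinical feature (synthetic)
housing_insecure = rng.integers(0, 2, n)  # social determinant, 0/1 (synthetic)
logit = 0.03 * (age - 60) + 0.8 * housing_insecure
admitted = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([age, housing_insecure])
model = LogisticRegression().fit(X, admitted)
pred = model.predict_proba(X)[:, 1]

# Compare mean predicted vs. observed risk within each subgroup; a gap
# in one group but not the other would suggest systematic bias.
for group in (0, 1):
    mask = housing_insecure == group
    print(f"housing_insecure={group}: "
          f"predicted={pred[mask].mean():.3f}, "
          f"observed={admitted[mask].mean():.3f}")
```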

Producing oil and gas in the North Sea, off the coast of the United Kingdom, requires a lease to extract resources from beneath the ocean floor, and companies bid for those rights. This team will consult with professionals at ExxonMobil to understand why these leases are acquired and who benefits. Doing so requires historical bid data to investigate what leads to an increase in (a) the number of leases acquired and (b) the number of companies participating in auctions. The team's goal is to create a well-structured dataset of company bid history from the U.K. Oil and Gas Authority; the source data come in many different structures and formats (tabular, PDF, etc.). The team will curate these data into a single, tabular database of U.K. bid history and work programs.
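
The curation step might look something like the sketch below, which assumes the pandas and pdfplumber libraries and uses hypothetical file and column names; the actual Oil and Gas Authority sources and schema would differ.

```python
# Illustrative sketch (assumed workflow, not the team's deliverable):
# pull bid records from mixed formats into one tabular dataset.
# File names and column names are hypothetical placeholders.
import pandas as pd
import pdfplumber

frames = []

# Tabular sources load directly.
frames.append(pd.read_csv("uk_bids_2001.csv"))
frames.append(pd.read_excel("uk_bids_2005.xlsx"))

# PDF sources need table extraction before they fit the same schema.
with pdfplumber.open("uk_bids_1999.pdf") as pdf:
    rows = [row for page in pdf.pages
            for table in page.extract_tables() for row in table]
    frames.append(pd.DataFrame(rows[1:], columns=rows[0]))

# Normalize every source to a shared schema, then concatenate.
columns = ["licence_round", "block", "company", "bid_year", "work_programme"]
combined = pd.concat([f.reindex(columns=columns) for f in frames],
                     ignore_index=True)
combined.to_csv("uk_bid_history_combined.csv", index=False)
```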

Producing oil and gas in the Gulf of Mexico requires rights to extract these resources from beneath the ocean floor, and companies bid into the market for those rights. The top bids are sometimes significantly larger than the next-highest bids, but it is not always clear why this differential exists, and some companies seemingly overbid by large margins. This team will consult with professionals at ExxonMobil to curate and analyze historical bid data from the Bureau of Ocean Energy Management, which contains information on company bid history, infrastructure, wells, and seismic surveys, along with data from the companies themselves and on geopolitical events. The team's stretch goal is to uncover the rationale behind historic bidding patterns. What do the highest bidders know that other bidders do not (if anything)? What characteristics might incentivize overbidding to minimize the risk of losing the right to produce (i.e., ambiguity aversion)?
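
The bid-differential question lends itself to a simple first computation. The sketch below, with invented data and hypothetical column names, measures how much more each winning bid was than the runner-up (the "money left on the table"):

```python
# Hedged sketch with synthetic data: per-block gap between the top bid
# and the second-highest bid. Column names are placeholders.
import pandas as pd

bids = pd.DataFrame({
    "block":   ["A1", "A1", "A1", "B2", "B2"],
    "company": ["X", "Y", "Z", "X", "Q"],
    "bid_usd": [12_000_000, 4_500_000, 4_000_000, 7_000_000, 6_800_000],
})

def premium(amounts):
    # Difference between the winning bid and the runner-up bid.
    top_two = amounts.nlargest(2)
    return top_two.iloc[0] - top_two.iloc[1]

# Large premiums flag blocks where the winner may have overbid.
print(bids.groupby("block")["bid_usd"].agg(premium))
```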