Abstract: Information asymmetries and power imbalances in Online Behavioral Advertising (OBA) have led to a lack of user awareness and participation, insufficient understanding, misrepresentation, and limited user action. Existing transparency mechanisms offered by ad networks have been shown to be insufficient, and users have no way to directly monitor and control their own ad-targeting profiles. These problems stem primarily from the lack of transparency in OBA; however, increased transparency can have mixed effects. To address this problem, I draw on theories of privacy as control, contextual integrity, and boundary regulation, as well as newly introduced regulations on solely automated decisions, to design and deploy a technology probe that gives users in-situ (i.e., in the moment) and real-time awareness and control over the composition of their ad-targeting profiles. The goal of my research is to explore questions of increased user participation, feasibility, and future designs of transparency tools for OBA. The technology probe attempts to reverse-engineer the inferencing and profiling performed by ad networks in order to give users longitudinal visibility into, and control over, the potential ad-targeting profiles arising from their online and offline activities. My field deployment of the probe will elicit user experimentation, inform the design of future OBA transparency mechanisms, and explore transparency design alternatives involving increased user participation.