If you want to help shape the policies and practices that enable hundreds of millions of search users to find the trustworthy information they need, and to take on a problem with real societal impact, this is the team you’re looking for! We are part of Microsoft’s Web Experiences Team.
Across the broader internet, misinformation, hateful content, and other unsafe material are continually being created. Our goal is to understand this ever-changing external landscape, translate that understanding into product principles and policies, and then work to make search safer and more responsible. We accomplish this through policy, partnerships, operations, and technology, including recent major advances in deep learning and natural language processing.
We’re seeking a leader to expand and scale our responsible AI and safety partnerships. You’ll build connections with industry, academia, and safety stakeholders to promote research, gain insights, and gather actionable data. You’ll work with other product managers and developers to make our products safer. As part of this role, you’ll have input into the policies that shape the future of AI in search.
As a team, we leverage the diverse backgrounds and experiences of passionate engineers, scientists, and program managers to realize our goal of making the world smarter and more productive. We believe great products are built by inclusive teams of customer-obsessed individuals who trust each other and work together closely. We collaborate regularly across the company, from finding technology breakthroughs in groups like Microsoft Research to infusing AI into other Microsoft products like Office and Azure.
Microsoft’s mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.