New technologies often create unanticipated problems. Social media has been blamed for a range of behavioral problems, such as online addiction and shortened attention spans, but it has also precipitated deep social issues that challenge our core values of freedom and privacy. In response, tech companies have implemented various policies centered on controlling the flow of information through their systems, which in turn has created yet more challenges to freedom by centralizing information control in the hands of a few private companies.

The US government got involved after Facebook was accused of sharing its users’ private personal information with third parties and of providing a propaganda platform for Russian operatives seeking to influence domestic political campaigns. In his testimony to Congress, Facebook CEO Mark Zuckerberg told lawmakers that he and his company take these issues very seriously and invest considerable money and effort in protecting privacy and preventing political interference.

There are no easy remedies for this emerging social media problem: tech firms are unfit to unilaterally judge social issues, and no regulatory body is technically equipped for the job. Facebook, Twitter and other tech companies, being private for-profit enterprises, certainly shouldn’t be trusted to faithfully manage key social issues related to privacy, politics, diplomacy, free speech and social justice. The government, on the other hand, cannot be expected to understand the technical ramifications of implementing oversight and regulation in all these cases. Besides, regulating information platform by platform is likely to fail, because as soon as one platform comes under government oversight, a competitor free of that oversight will probably draw more users through its ability to innovate without restriction.

Responding to her brother’s remarks about Holocaust deniers having a right to free speech on Facebook, Randi Zuckerberg wrote, “I also don’t want to live in a world where tech companies get to decide who has the right to speech and get to police content in a way that is different from what our legal system dictates.”

The only way forward may involve a coordinated effort between tech companies and lawmakers to define and enforce clear demarcations between free speech and hate speech, between privacy and public engagement, and between legal use and illegal activity. An important aspect of such a mutual effort would be delineating the responsibilities of the tech companies, given their core competencies, from those of government and law enforcement.

We can learn something from the focused fiduciary responsibilities of lawyers and doctors. “Lawyers know the importance of professional specialization and working within the area of their expertise,” says Laurence B. Green, attorney and founder of Berger and Green. We don’t expect or allow lawyers to define the law for themselves, and we don’t allow doctors to experiment with new pharmaceuticals on their own; we rely on Congress and the FDA, respectively, to define, regulate and enforce those policies.