The U.S. Senate just introduced the scariest bill Americans have ever seen

Michael Bennet

The Washington, D.C. Swamp has all but taken control of American life. Now they’re trying to make it official.

Because the U.S. Senate just introduced the scariest bill Americans have ever seen.

Democratic Senator Michael Bennet has introduced legislation that would give a brand-new federal agency the authority to create a council to determine “enforceable behavioral codes” for artificial intelligence and social media platforms.

Experts in “disinformation” would also be included on the council’s panel.

To counteract “hate speech” and “misinformation,” the Senate has introduced legislation to create a federal institution with the responsibility of monitoring public dialogue around the clock.

The senator from Colorado, with backing from Peter Welch (D-Vt.), introduced the Digital Platform Commission Act (DPCA) last week. It would establish what amounts to a “Ministry of Truth” under the guise of the Federal Digital Platform Commission.

According to the proposed framework, the agency would be led by a panel of five commissioners selected by President Joe Biden and subsequently confirmed by the Senate.

Anyone who disagrees with the official line of the party in power at the time, in this case the Democrats, would apparently face fines from the Federal Digital Platform Commission.

However, civil rights attorney Harmeet Dhillon has said the bill would unquestionably violate the First Amendment. It’s “Unconstitutional, also evil and stupid,” she tweeted on Friday.

The DPCA’s introduction came on the heels of the ODNI’s announcement that it had established the Foreign Malign Influence Center (FMIC) to deal with foreign malign influence in U.S. elections and “public opinion within the United States.”

According to Director of National Intelligence Avril Haines, the FMIC would work with the State Department’s Global Engagement Center (GEC), which the “Twitter Files” revealed operates as a sinister censorship arm of the federal government, to achieve its goal of information domination.

A year ago, in response to widespread public concern, the Department of Homeland Security dissolved its “Disinformation Governance Board” (DGB).

Government documents suggest that the DGB was tasked with detecting and silencing speech that posed “serious homeland security risks,” such as “conspiracy theories” about election fraud, the effects of the COVID-19 vaccination, and “the efficacy of masks.”

Bennet defended the call for such regulation in his statement by claiming that unrestrained “technology” was harming children and eroding democracy.

Bennet, himself a sitting member of Congress, did not sound especially confident in his own institution. He claimed that Congress cannot keep up with the rapid changes brought on by technological advancement, and said he did not trust the Federal Trade Commission or the Department of Justice to provide “robust and sustained” oversight of the digital platform industry, citing a lack of funding and trained personnel.

Consumer protection from “addictive design features or harmful algorithmic processes” is cited as an objective in the announcement.

Bennet went on to add that under his and Welch’s concept, all animals are equal, but some are arguably more equal than others: the Commission would be able to designate “systemically important” digital platforms for additional oversight and regulation, such as audits and “explainability” requirements for algorithmic decisions, due to their potential impact on the economy as a whole.

To ensure accountability and transparency in algorithmic processes, among other things, the bill proposes establishing a Code Council within the Commission to draft relevant behavioral codes, technical standards, and other policies.

According to reports, the Council would have 18 members, including representatives from digital platforms or their associations (at least three of whom must come from platforms deemed “systemically important”).

The bill’s introductory text, in which the authors go to considerable lengths to justify establishing the Commission, reflects the same tangle of incompatible concepts and concerns that defines the Council itself.

It is said to be a solution to problems like the destruction of “trusted” local journalism, the promotion of addiction, the dissemination of hate speech, the theft of personal information for profit, the radicalization of individuals, and the free expression of racism and sexism.