Tag Archives: tiktok

U.S. Lawmakers Unveil Bill To Ban TikTok In The U.S.



A new bill from a bipartisan group of lawmakers, if passed, would ban TikTok in the U.S. after years of broad concern across the Trump and Biden administrations about potential Chinese government influence on the company, CNBC reported.

TikTok, owned by Chinese company ByteDance, has raised fears in the U.S. that Chinese government officials could gain access to U.S. user data under a Chinese law that could compel the company to hand over information, CNBC reported. TikTok has insisted U.S. user data is safely stored outside of China, which it says would keep it out of reach of government officials.

According to CNBC, the Committee on Foreign Investment in the U.S. is in talks with the company about how to resolve some of the data concerns, though a solution has reportedly been delayed. FBI Director Christopher Wray testified before Congress that he’s “extremely concerned” about the Chinese government’s potential influence through TikTok on U.S. users.

Senator Marco Rubio (Republican – Florida) introduced bipartisan legislation to ban TikTok from operating in the United States. U.S. Representatives Mike Gallagher (Republican – Wisconsin) and Raja Krishnamoorthi (Democrat – Illinois) introduced companion legislation in the U.S. House of Representatives.

The legislation is titled: “Averting the National Threat of Internet Surveillance, Oppressive Censorship and Influence, and Algorithmic Learning by the Chinese Communist Party Act (ANTI-SOCIAL CCP Act)”. The description of the legislation states that it would protect Americans by blocking and prohibiting all transactions from any social media company in, or under the influence of, China, Russia, and several other foreign countries of concern.

ArsTechnica reported that the ANTI-SOCIAL CCP Act is designed to block and prohibit all transactions by social media companies controlled or influenced by “countries of concern.” The legislation specifically names TikTok and its owner, ByteDance, as national security threats.

According to ArsTechnica, if the legislation is passed, its provisions would also extend to any social media platform controlled by other U.S. foreign adversaries, including Russia, Iran, North Korea, Cuba, and Venezuela.

Engadget reported that while the sponsors of the bill characterize the measure as bipartisan, it’s not clear the call for a TikTok ban has enough support to clinch the necessary votes and reach Biden’s desk. To some degree, Engadget wrote, the ANTI-SOCIAL CCP Act is more a signal of intent than a practical way to block TikTok.

There is no way to know for certain whether this bill will become law. Personally, I think it is a good idea to prevent lawmakers from having TikTok on their devices, especially if there are valid concerns about TikTok collecting data through its app.


Texas Bans TikTok On Government Devices



Texas Governor Greg Abbott on Wednesday ordered state agencies to ban TikTok on government-issued devices, citing security concerns about the app’s data-sharing practices with the Chinese government, The Guardian reported.

“TikTok harvests vast amounts of data from its users’ devices – including when, where and how they conduct Internet activity – and offers this trove of potentially sensitive information to the Chinese government,” according to one of the letters the governor sent to state agency leaders.

“While TikTok has claimed that it stores US data within the US, the company admitted in a letter to Congress that China-based employees can have access to US data. It has also been reported that ByteDance planned to use TikTok location information to surveil individual American citizens,” it added.

According to The Guardian, the letter also cited China’s 2017 National Intelligence Law, stating that businesses are required to assist China in intelligence work including data sharing. It recounted that TikTok’s algorithm had already censored topics politically sensitive to the Chinese Communist Party, including the Tiananmen Square protests.

Texas isn’t the only state that has become wary of TikTok. The New York Times reported that Indiana’s attorney general on Wednesday sued the Chinese-owned app TikTok for deceiving users about China’s access to their data and for exposing children to mature content, in the first state lawsuits against the popular video service.

According to The New York Times, Todd Rokita, Indiana’s attorney general, claimed that TikTok, which is owned by the Chinese company ByteDance, violated state consumer protection laws by failing to disclose the Chinese government’s ability to tap sensitive consumer information. His office said in a separate complaint that TikTok deceived young users and their parents with its age rating of 12-plus in Apple’s and Google’s app stores, when in fact inappropriate sexual and substance-related content can be easily found and is pushed by the company to children using the app.

The Hill reported that both of the lawsuits from Indiana and Texas seek to prevent TikTok from continuing its allegedly deceptive practices and demand civil penalties of up to $5,000 per violation.

Texas and Indiana aren’t the only states that are cracking down on TikTok. The Guardian reported that governor Abbott’s orders follow in the footsteps of Maryland governor Larry Hogan, who on Tuesday also ordered the ban of TikTok and several other China- and Russia-based platforms in the state’s executive government branch. Wisconsin’s Republican representatives in Congress on Tuesday called for governor Tony Evers to delete TikTok from all state government devices, calling it a national security threat.

The Verge reported that the Republican governors of North Dakota and South Dakota have also banned the use of TikTok on government devices. The Army, Navy, and Departments of Homeland Security and State have also banned use of the app on government-issued devices.

In short, it appears that several states have started banning TikTok on government devices, and two states have filed lawsuits against TikTok. My advice to content creators who primarily post on TikTok is to find a different platform to post their videos on, just in case TikTok gets banned.


TikTok Raises Age Requirement For Going LIVE



TikTok is updating its livestream system to limit kids from going live and to allow streamers to reach only adults, The Verge reported. According to The Verge, TikTok currently allows those 16 and older to stream live. The company is changing that to 18 and older.

On October 17, TikTok posted news titled: “Enhancing the LIVE community experience with new features, updates, and policies”. It includes the following:

The foundation of TikTok is built on community trust and safety. To protect our users and creators and support their well-being, we constantly work to evolve the safeguards we put in place. Today, we’re making additional changes and improvements to help our community have the best experience possible when they use LIVE.

Currently, people must be 16 or over to host a LIVE. From November 23, the minimum age will increase from 16 to 18. As we consider the breadth of our global audience, we already take a graduated approach to the features that our community can access based on their age; younger teens need to be 16 or older to access Direct Messaging and 18 or older to send virtual gifts or access monetization features.

This news might be disappointing to TikTok users who are not yet 18 years old. They will, eventually, gain access to the LIVE feature when they get older. Waiting two years (or more) for access might seem like a very long time, though.

There is another change TikTok is making that may have influenced its decision to limit LIVE to those who are 18 or older. From TikTok’s news:

In addition, in the coming weeks, we plan to introduce a new way for creators to choose if they’d prefer to only reach an adult audience in their LIVE. For instance, perhaps a comedy routine is better suited for people over age 18. Or, a host may plan to talk about a difficult life experience and they would feel more comfortable knowing the conversation is limited to adults. We want our community to make the most of the opportunities LIVE can bring without compromising safety. We believe these industry-leading updates can further protect the younger members of our community as they start and build their online presence.

To be clear, it does not sound as though TikTok is expecting people to turn their LIVE into something like OnlyFans. I suspect some people might attempt to do that, though. As such, it makes a lot of sense for TikTok to prevent people under the age of 18 from accessing LIVE, while also giving the adults on TikTok a way to weed out those who are underage from watching their LIVE.


TikTok Launches “Elections Center” To Combat Misinformation



TikTok announced its midterm Elections Center will go live in the app in the U.S. starting August 17, 2022, where it will be available to users in more than 40 languages, including English and Spanish, TechCrunch reported.

According to TechCrunch, the new feature will allow TikTok users to access state-by-state election information, including details on how to register to vote, how to vote by mail, how to find your polling place and more, provided by TikTok partner NASS (the National Association of Secretaries of State).

TikTok also newly partnered with Ballotpedia to allow users to see who’s on their ballot, and is working with various assistance programs – including the Center for Democracy in Deaf America (for deaf voters), the Federal Voting Assistance Program (overseas voting), the Campus Vote Project (students) and Restore Your Vote (people with past convictions) – to provide content for specific groups. The AP will continue to provide the latest election results in the Elections Center.

TikTok posted a Safety post titled: “Our commitment to election integrity”. It was written by Eric Han, Head of US Safety. From the post:

At TikTok, we take our responsibility to protect the integrity of our platform – particularly around elections – with the utmost seriousness. We’re proud to be a place that brings people together over creative and entertaining content, and we work hard to keep harmful misinformation and other violations of our policies off our platform. As the US midterm elections continue, we’re sharing more on the work we’re doing to protect our community during this time.

Here are some things TikTok says it will do:

Promoting digital literacy skills and education. TikTok says its in-app center will feature videos that encourage its community to think critically about content they see online, as well as information about voting in the election.

Users will be directed away from TikTok for any action that requires them to share information, such as registering to vote; they will be sent to the website of the state or relevant non-profit in order to carry out that process. TikTok will not have access to any of that off-platform data or activity.

TikTok will also add labels to content identified as being related to the 2022 midterm elections as well as content belonging to governments, politicians, and political parties in the US. These labels will allow viewers to click through to TikTok’s center and get information about the elections in their state.

TikTok will provide access to the center through popular election hashtags, like #elections2022 and #midtermelections, so that anyone searching for that content will be able to easily reach it. Users can also use TikTok’s tools to automatically filter out videos with words or hashtags they don’t want to see in their For You or Following feeds.

It appears that TikTok is actually going to put some effort into preventing its site from becoming a quagmire of political misinformation. TikTok appears to have done its homework and connected with reliable sources of political information. My hope is that these efforts will work. Unfortunately, it is not unheard of for users of a social media site to get angry whenever something is put in place that prevents them from easily spreading election misinformation.


Social Media Companies Killed A California Bill To Protect Kids



California lawmakers killed a bill Thursday that would have allowed government lawyers to sue social-media companies for features that allegedly harm children by causing them to become addicted, The Wall Street Journal reported.

According to The Wall Street Journal, the measure would have given the attorney general, local district attorneys and city attorneys in the biggest California cities authority to try to hold social-media companies liable in court for features they knew, or should have known, could addict minors. Among those targeted could have been Facebook and Instagram parent Meta Platforms, Inc., Snapchat parent Snap Inc., and TikTok, owned by Chinese company ByteDance Ltd.

In June of 2022, Meta (parent company of Facebook and Instagram) was facing eight lawsuits filed in courthouses across the US that allege that excessive exposure to platforms including Facebook and Instagram has led to attempted or actual suicides, eating disorders and sleeplessness, among other issues. More specifically, the lawsuits claim that the company built algorithms into its platforms that lure young people into destructive behavior.

The Wall Street Journal also reported that the bill died in the appropriations committee of the California state senate through a process known as the suspense file, in which lawmakers can halt the progress of dozens or even hundreds of potentially controversial bills without a public vote, based on their possible fiscal impact.

The death of the bill comes after social media companies worked aggressively to stop the bill, arguing that it would lead to hundreds of millions of dollars in liability and potentially prompt them to abandon the youth market nationwide. Meta, Twitter Inc., and Snap all had individually lobbied against the measure according to state lobbying disclosures.

This doesn’t mean that a similar bill cannot be passed by the federal government. Politico reported earlier this month that the Commerce Committee advanced two bills for floor consideration: it approved the Children and Teens’ Online Privacy Protection Act on a voice vote and the Kids Online Safety Act by a unanimous 28-0.

According to Politico, the Kids Online Safety Act was co-sponsored by Richard Blumenthal (Democrat – Connecticut) and Marsha Blackburn (Republican – Tennessee). That bill, if passed, would require social media platforms to allow kids and their parents to opt out of content algorithms that have fed them harmful content and disable addictive product features.

The Children and Teens’ Online Privacy Protection Act was sponsored by Bill Cassidy (Republican – Louisiana) and Ed Markey (Democrat – Massachusetts). That bill, if passed, would extend existing privacy protections for preteens to children up to age 16 and ban ads from targeting them. It would also give kids and their parents the right to delete information that online platforms have about them.

Personally, I think that parents of children and teenagers who have allowed their kids to use social media should have complete control over preventing the social media companies from gathering data on their children. Huge social media companies need to find other ways of sustaining revenue that don’t involve mining the data of underage users in hopes of earning money from ads.


TikTok Announces Top Performing Videos Can Become Ads



TikTok announced that it’s launching a new ad product called “Branded Mission” that will allow creators to connect with brands and possibly receive rewards for videos, TechCrunch reported. According to TechCrunch, with this new ad product, advertisers can crowdsource content from creators and turn top-performing videos into ads.

TikTok posted in its newsroom more information about “Branded Mission”. From the post:

…To make it easier for brands to tap into the creative power of TikTok communities and co-create authentic branded content that resonates with users, we’re launching Branded Mission. Branded Mission is an industry-first ad solution that enables advertisers to crowdsource authentic content from creators on TikTok, turn top-performing videos into ads, and improve brand affinity with media impressions.

According to TikTok, this new form of two-way engagement between brands and creators enables the TikTok community to have a creative hand in the ads that are part of a brand campaign and helps brands discover emerging creators across TikTok.

By using Branded Mission, advertisers can:

Engage the community to participate in branded campaigns: Brands can develop a brief and release it to the creator community to participate in the Branded Mission.

Let creators tell the most relatable brand story in an authentic way: TikTok creators can decide what Branded Missions they are inspired by and choose to participate in the Mission. Brands will select their favorite original creative videos and amplify them through promoted ad traffic.

Discover a diverse ecosystem of creators who are the main drivers of culture on TikTok: Brands now have more opportunities to discover and engage with a broad ecosystem of creative and talented creators. Creators who are at least 18 years old with more than 1,000 followers will be eligible to participate in a Branded Mission.

According to TikTok, eligible creators whose videos are selected by the brand as ads will benefit from a cash payment and boosted traffic. On each Branded Mission page, creators will see the potential earning opportunity before choosing to participate.

It is worth noting that the Federal Trade Commission (FTC) makes it clear that the responsibility to disclose that content is an advertisement lies with the creator – not the brand.

“If you endorse a product through social media, your endorsement message should make it obvious when you have a relationship (“material connection”) with the brand. A “material connection” to the brand includes a personal, family, or employment relationship or a financial relationship – such as the brand paying you or giving you free or discounted products or services.”

I think it is a good idea for TikTok to enable a connection between brands and creators. I like that the brands have to be upfront about how much they are willing to pay a creator for allowing the brand to use their creative content. TikTok creators who are looking for a way to increase their income might be ready to create ads for brands.

On the other hand, it is entirely possible that the Branded Mission ads might fail. If the creators do the right thing and disclose that a video is an ad, it could make people decide not to watch it. Some people are going to reject that content specifically because it is yet another ad. In general, people tend to avoid ads as much as possible.


TikTok Reveals its State-Controlled Media Policy



TikTok posted information about its state-controlled media policy in a Newsroom post titled: “Bringing more context to content on TikTok”. Some social media companies already have put in place similar policies. Those who don’t have one will probably create one now.

Last year we began working to develop a consistent and comprehensive state media policy, as we recognize that an additional layer of context can be helpful to viewers, especially in times of war and in conflict zones. In response to the war in Ukraine, we’re expediting the rollout of our state media policy to bring viewers context to evaluate the content they consume on our platform…

TikTok will begin by applying labels to content from some state-controlled media accounts over the coming days.

Here are some key points from TikTok’s policies:

We recognize the heightened risk and impact of misleading information during a time of crisis. We continue to increase our safety and security measures and are working aggressively to help ensure people can express themselves and share their experiences, while we also seek to mitigate the potential for harm.

TikTok uses a combination of technology and people to protect its platform. Its teams speak more than 60 languages and dialects, including Russian and Ukrainian.

TikTok reminds users that their Community Guidelines prohibit content that contains harmful misinformation, hateful behavior, or promotion of violence. The company will remove violative content, will ban accounts, and will suspend access to product features like livestream to those who break the rules.

TikTok also has evolved its methods in real-time to identify and combat harmful content, such as implementing additional measures to help detect and take action on livestreams that may broadcast unoriginal or misleading content.

TikTok will “remain focused on preventing, detecting, and deterring influence operations on our platform and our systems help us to identify, block and remove inauthentic accounts, engagement, or other associated activities on TikTok”.

The New York Times reported that some TikTok users were viewing videos of Ukrainian tanks taken from video games, as well as a soundtrack that was first uploaded to the app more than a year ago. Some who viewed that content believed they were seeing legitimate, authentic videos posted by people in Ukraine.