Category Archives: Privacy

Signal Rolls Out Usernames To Enhance Users’ Privacy



For nearly a decade, cybersecurity professionals and privacy advocates have recommended the end-to-end encrypted communications app Signal as the gold standard for truly private digital communications. Using it, however, has paradoxically required exposing one particular piece of private information to everyone you text or call: a phone number. Now, that’s changing, WIRED reported.

Today, Signal launched the beta rollout of a long-awaited set of features it’s describing simply as “phone number privacy”. Those features, which WIRED has tested, are designed to let users communicate on the app without handing over a phone number, sharing a username instead as a less sensitive way of connecting with one another.

Signal posted the following on their website:

Signal’s mission and sole focus is private communication. For years, Signal has kept your messages private, your profile information (like your name and profile photo) private, your contacts private, and your groups private — among much else. Now, we’re taking that one step further by making your phone number on Signal more private.

New default: Your phone number will no longer be visible to everyone in Signal

If you use Signal, your phone number will no longer be visible to everyone you chat with by default. People who have your number saved in their phone’s contacts will still see your phone number since they already know it.

Connect without sharing your phone number

If you don’t want to hand out your phone number to chat with someone on Signal, you can now create a unique username that you can use instead (you will still need a phone number to sign up for Signal). Note that a username is not the profile name that’s displayed in chats, it’s not a permanent handle, and not visible to the people you are chatting with in Signal. A username is simply a way to initiate contact on Signal without sharing your phone number.

Control who can find you on Signal by phone number

If you don’t want people to be able to find you by searching for your phone number on Signal, you can now enable a new, optional privacy setting. This means that unless people have your exact unique username, they won’t be able to start a conversation, or even know that you have a Signal account — even if they have your phone number.

Right now, these options are in beta, and will be rolling out to everyone in the coming weeks.

The Verge reported that if you do decide to create a username, it won’t appear on your profile details page or in your chats. Other users won’t be able to see it unless you share it. “Put another way, someone will need to know your exact unique username in order to start a chat with you on Signal,” Randall Sarafa, the chief product officer at Signal, writes in a post announcing the features’ rollout.

In my opinion, privacy is very important for users of any messaging app or social media site. A lack of privacy on these services can cause chaos and potential harm.


Tech Companies Under Pressure To Safeguard Data After Roe v. Wade Overturned



After the US Supreme Court overturned Roe v. Wade on Friday, calls increased for tech companies to take a stand on the use of online data to incriminate individuals seeking or providing abortion services, The Guardian reported.

According to The Guardian, abortion and civil rights advocates have warned that there are few federal regulations on what information is collected and retained by tech firms, making it easy for law enforcement officials to access incriminating data on location, internet searches, and communication history.

The Electronic Frontier Foundation (EFF) posted “Digital Safety Tips: For People Seeking an Abortion”. In it, EFF recommends that people keep searches related to abortion separate from their daily lives; compartmentalizing that activity makes it harder to trace back to them.

EFF also recommends choosing a separate browser with hardened privacy settings, suggesting browsers like Brave, Firefox, and DuckDuckGo on mobile, and turning off the browser’s ability to remember browsing history and site data/cookies. People who need to call a clinic or healthcare provider should do so through a Google Voice phone number instead of their actual phone number.

CNBC reported that tech companies may have to contend with user privacy issues related to such health care whether they want to or not. That could be the case if they are ordered by a court to hand over certain types of data, like the location information of users at an abortion clinic, search histories, or text messages.

According to CNBC, before the decision [by the Supreme Court], lawmakers called on Google and the Federal Trade Commission to ensure that data from online consumers seeking care would be protected in the event of a ruling overturning Roe v. Wade. The letters came in the wake of Politico’s reporting on a leaked draft decision that would cut back those protections.

Ars Technica reported that four Democratic US senators asked the Federal Trade Commission to “investigate Apple and Google for engaging in unfair and deceptive practices by enabling the collection and sale of hundreds of millions of mobile phone users’ personal data”.

The letter cited the Supreme Court decision overturning Roe v. Wade, saying that women “seeking abortions and other reproductive healthcare will become particularly vulnerable to privacy harms, including through the collection of their location data.”

Personally, I don’t think anyone wants to have their data collected by big tech companies, no matter what they were looking at on their phone or computer. Nobody wants data brokers to sell their information. I think digital privacy is something we can all agree needs stronger protection.


Clearview AI Settles Lawsuit Brought By ACLU



You’ve probably heard of Clearview AI, a company that unethically captured more than 10 billion “faceprints” from people’s online photos across the globe, without their consent. The American Civil Liberties Union (ACLU) filed a lawsuit against Clearview AI, which recently resulted in a settlement.

The lawsuit was filed by the ACLU in Illinois state court in January of 2020, after The New York Times revealed that Clearview was building a secretive surveillance tool using biometric identifiers. Face recognition technology had helped Clearview capture more than three billion faceprints, and counting, from images available online.

Illinois has a law called the Illinois Biometric Information Privacy Act (BIPA), which was adopted in 2008 to ensure that Illinois residents would not have their biometric identifiers, including faceprints, captured and used without their knowledge and permission.

The groups represented by the ACLU in the lawsuit, including survivors of domestic violence and sexual assault, undocumented immigrants, communities of color, and members of other vulnerable communities, asked the court to order Clearview to delete faceprints gathered from Illinois residents without their consent and to stop capturing new faceprints unless it complied with BIPA’s consent procedures.

The New York Times reported that Clearview AI agreed to settle the lawsuit brought by the ACLU. The settlement requires Clearview not to sell its database of what it said were more than 20 billion facial photos to most private individuals and businesses in the country. It can still sell that database to federal and state agencies.

The New York Times reported the following:

The agreement is the latest blow to the New York-based start-up, which built its facial recognition software by scraping photos from the web and popular sites, such as Facebook, LinkedIn, and Instagram. Clearview then sold its software to local police departments and government agencies, including the F.B.I. and Immigration and Customs Enforcement.

According to The New York Times, Clearview’s technology has been deemed illegal in Canada, Australia, and parts of Europe for violating privacy laws. Clearview also faces a provisional $22.6 million fine in Britain, as well as a 20 million-euro fine from Italy’s data protection agency.

Nathan Freed Wessler, a deputy director with the ACLU’s Speech, Privacy, and Technology Project, said in a statement to The New York Times: “Clearview can no longer treat people’s unique biometric identifiers as an unrestricted source of profits. Other companies would be wise to take note, and other states should follow Illinois’ lead in enacting strong biometric privacy laws.”

I find it extremely troubling that Clearview appeared to think it was acceptable to secretly gather photos of people’s faces without their permission. It is even worse that Clearview was selling access to those faceprints in an effort to enhance its own profits.


Brave Browser Is Rolling Out a De-AMP Feature To Enhance Privacy



Brave announced that they are rolling out a new feature called De-AMP, which allows Brave users to bypass Google-hosted AMP pages and instead visit the content’s publisher directly. Brave states that AMP harms users’ privacy, security, and internet experience, and, just as bad, helps Google further monopolize and control the direction of the Web.

Brave will protect users from AMP in several ways. Where possible, De-AMP will rewrite links and URLs to prevent users from visiting AMP pages altogether. And in cases where that is not possible, Brave will watch as pages are being fetched and redirect users away from AMP pages before the page is even rendered, preventing AMP/Google code from being loaded and executed.
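To make the URL-rewriting idea concrete, here is a minimal, hypothetical sketch of how a Google AMP cache URL can be mapped back to the publisher’s own URL. It is an illustration of the concept only, not Brave’s De-AMP implementation, and real AMP cache URLs come in more variants than the two patterns handled here.

```typescript
// Hypothetical sketch: rewrite a Google AMP cache URL back to the publisher's URL.
// Not Brave's actual De-AMP code; it only handles two common AMP cache patterns.
function deAmpUrl(rawUrl: string): string {
  const url = new URL(rawUrl);

  // Pattern 1: https://www.google.com/amp/s/publisher.com/path
  if (url.hostname === "www.google.com" && url.pathname.startsWith("/amp/")) {
    const rest = url.pathname.slice("/amp/".length);
    const secure = rest.startsWith("s/");
    const target = secure ? rest.slice(2) : rest;
    return `${secure ? "https" : "http"}://${target}${url.search}`;
  }

  // Pattern 2: https://publisher-com.cdn.ampproject.org/c/s/publisher.com/path
  if (url.hostname.endsWith(".cdn.ampproject.org")) {
    const match = url.pathname.match(/^\/[cv]\/(s\/)?(.+)$/);
    if (match) {
      return `${match[1] ? "https" : "http"}://${match[2]}${url.search}`;
    }
  }

  // Not an AMP cache URL we recognize: return it unchanged.
  return rawUrl;
}

// Example:
// deAmpUrl("https://www.google.com/amp/s/example.com/story.amp.html")
//   -> "https://example.com/story.amp.html"
```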

The Verge reported that Brave stated: “In practice, AMP is harmful to users and to the Web at large”. Brave also explained that AMP gives Google even more knowledge of users’ browsing habits, confuses users, and can often be slower than normal webpages. It also warned that the next version of AMP – so far called AMP 2.0 – will be even worse.

Brave pointed out why AMP is harmful:

AMP is harmful to privacy: AMP gives Google an even broader view of which pages people view on the Web, and how people interact with them. AMP encourages developers to more tightly integrate with Google servers and systems, and penalizes publishers with decreased search rankings and placements if they don’t, further allowing Google to track and profile users.

AMP is bad for security: By design, AMP confuses users about what site they’re interacting with. Users think they’re interacting with the publisher, when in actuality the user is still within Google’s control.

AMP furthers the monopolization of the Web: AMP encourages more of the Web to be served from Google’s servers, under Google’s control and arbitrary non-standards. It also allows Google to require pages to be built in ways that benefit Google’s advertising systems.

AMP is bad for performance and usability: Though Google touts AMP as better for performance, internally, Google knows that “AMP only improves the ‘median of performance’ and AMP pages can actually load slower than other publisher speed optimization techniques”.

The Verge explained that AMP was controversial from the beginning and smelled to some like Google trying to exert even more control over the web. Over time, more companies and users grew concerned about that control and chafed at the idea that Google would prioritize AMP pages in search results.

DuckDuckGo tweeted: “NEW: our apps & extensions now protect against AMP tracking. When you load or share a Google AMP page anywhere from DuckDuckGo apps (iOS/Android/Mac) or extensions (Firefox/Chrome) the original publisher’s webpage will be used in place of the Google AMP version.”

Personally, I think that the more privacy we have online, the better the internet will be for all of us. It is great that Brave and DuckDuckGo are offering people simple ways to prevent Google from tracking them all over the web. It is very sketchy of Google to trick users into thinking they are on the website they searched for while actually serving them a Google-hosted AMP page.


DuckDuckGo Introduces Beta Launch of DuckDuckGo for Mac



DuckDuckGo announced the beta launch of DuckDuckGo for Mac. Like their mobile app, DuckDuckGo for Mac is an all-in-one privacy solution for everyday browsing with no complicated settings, just a seamless experience. DuckDuckGo for Windows is coming soon.

Using an app designed to protect your privacy by default not only reduces invasive tracking, it also speeds up browsing and eliminates many everyday annoyances like cookie consent pop-ups.

DuckDuckGo for Mac gives you privacy by default. With one download you get their built-in private search engine, powerful tracker blocker, new cookie pop-up protection on approximately 50% of sites (with that percentage growing significantly throughout the beta), Fire Button (one-click data clearing), Email Protection, and more – all for free. No complicated privacy settings, just simple privacy protection that works by default.

DuckDuckGo for Mac is really fast! By using your computer’s built-in website rendering engine (the same one Safari uses), and by blocking trackers before they load (unlike all the major browsers), you’ll get really fast browsing. It already beats Chrome in some graphics performance tests (using the MotionMark 1.2 benchmark), and as an additional benefit of blocking trackers, DuckDuckGo uses about 60% less data than Chrome.

DuckDuckGo for Mac is built for security. Their built-in Smarter Encryption ensures you navigate to the encrypted (HTTPS) version of a website more often, and their tracking blocker means less exposure to third-party scripts that could try to access your data. And they design their product so that all in-app data, like history, bookmarks, and passwords, by default are only stored locally on your device and aren’t accessible to DuckDuckGo.
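As a rough illustration of the HTTPS-upgrading idea behind a feature like Smarter Encryption, the hypothetical sketch below swaps http:// for https:// when a site is on a known list of HTTPS-capable hosts. The list and function names are made up for the example; DuckDuckGo’s real feature relies on its own dataset of sites known to support HTTPS.

```typescript
// Hypothetical sketch of an HTTPS upgrade step, not DuckDuckGo's implementation.
// knownHttpsHosts stands in for a real dataset of sites known to support HTTPS.
const knownHttpsHosts = new Set<string>(["example.com", "www.example.com"]);

function upgradeToHttps(rawUrl: string): string {
  const url = new URL(rawUrl);
  // Only upgrade plain-HTTP navigations to hosts we know serve HTTPS.
  if (url.protocol === "http:" && knownHttpsHosts.has(url.hostname)) {
    url.protocol = "https:";
  }
  return url.toString();
}

// Example: upgradeToHttps("http://example.com/login") -> "https://example.com/login"
```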

The beta of DuckDuckGo for Mac has a waitlist. The company is letting new people off the waitlist, and says the sooner you join, the sooner you’ll get it. You won’t need to share any personal information to join. Instead you’ll secure your place in line with a date and time that exists solely on your device. DuckDuckGo will notify you when they are ready for you to join the beta.

Right now, the beta is Mac only. The Windows beta is coming. DuckDuckGo recommends Windows users follow them on Twitter for updates.

Personally, I would like to check out DuckDuckGo for Mac after it launches. I’m not a fan of having my data taken without my permission by websites that want to use it for their own financial benefit.

Doing that is very likely illegal where I live, because California passed the California Privacy Rights Act (CPRA) in 2020. There have been countless times when a website intentionally makes it extremely difficult to shut off its tracking and cookies. I would love to see DuckDuckGo for Mac shut all that garbage off for me!


WhatsApp Lets Users Control How Long Messages Stick Around



WhatsApp announced that they are providing users with more options to control their messages and how long they stick around, with default disappearing messages and multiple durations. This makes sense, considering that WhatsApp introduced disappearing messages last year, along with a way for photos and videos to disappear immediately after being viewed once.

WhatsApp users will now have the option to turn on disappearing messages by default for all new chats. When enabled, all new one-on-one chats you or another person start will be set to disappear at your chosen duration, and WhatsApp has added a new option when creating a group chat that lets you turn it on for groups you create. This new feature does not change or delete any of your existing chats.

Two new durations for disappearing messages have been added: 24 hours and 90 days, as well as the existing option of 7 days. If you want to enable the new durations, or change the ones you have, go to Privacy and select “Default Message Timer”.

Based on WhatsApp’s blog post, it appears these decisions may have been influenced by the pandemic. The company wrote, “Living apart from family and friends for over a year has made it clearer than ever that just because we can’t physically talk in person, it doesn’t mean we should have to sacrifice the privacy of our personal conversations.”

The WhatsApp blog also states: “We believe disappearing messages along with end-to-end encryption are two crucial features that define what it means to be a private messaging service today, and bring us one step closer to the feeling of an in-person conversation.”

The Verge pointed out that WhatsApp is owned by Meta (along with Facebook and Instagram). If you don’t currently use Facebook products, you might want to consider if you trust Meta enough to do the right thing with your data.


Google’s FLoC is Unpopular with Other Browser Creators



Google introduced a new piece of technology called “Federated Learning of Cohorts” (FLoC). According to Google, FLoC “protects your privacy” because it “allows you to remain anonymous as you browse across websites and also improves privacy by allowing publishers to present relevant ads to large groups (called Cohorts)”.

EFF has launched “Am I FLoCed?”, a new site that will tell you whether your Chrome browser has been “turned into a guinea pig for Federated Learning of Cohorts or FLoC, Google’s latest targeted advertising experiment.”
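For readers curious how a page like “Am I FLoCed?” can tell, during Chrome’s origin trial the browser exposed a document.interestCohort() API that returned the visitor’s cohort ID. Below is a small, hedged sketch of checking for that API; it is illustrative only, assumes the API shape used during the trial, and is not EFF’s actual code.

```typescript
// Hedged sketch: probe for the FLoC origin-trial API (document.interestCohort).
// Illustrative only; assumes the API shape used during Chrome's trial.
interface InterestCohort {
  id: string;
  version: string;
}

async function checkFloc(): Promise<void> {
  const doc = document as Document & {
    interestCohort?: () => Promise<InterestCohort>;
  };

  if (typeof doc.interestCohort !== "function") {
    console.log("FLoC API not present in this browser.");
    return;
  }

  try {
    const cohort = await doc.interestCohort();
    console.log(`FLoC is active: cohort ${cohort.id} (version ${cohort.version})`);
  } catch {
    // The API exists but no cohort is available (e.g. FLoC disabled or blocked).
    console.log("FLoC API present, but no cohort was assigned.");
  }
}

void checkFloc();
```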

Google’s FLoC is unpopular with other browser creators. Brave posted a blog post titled “Why Brave Disables FLoC”:

“Brave opposes FLoC, along with any other feature designed to share information about you and your interests without your fully informed consent. To protect Brave users, Brave has removed FLoC in the Nightly version of both Brave for desktop and Android. The privacy-affecting aspects of FLoC have never been enabled in Brave releases; the additional implementation details of FLoC will be removed from all Brave releases with this week’s stable release. Brave is also disabling FLoC on our websites, to protect Chrome users learning about Brave.”
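Brave’s note about disabling FLoC on its own websites refers to the server-side opt-out: sending a Permissions-Policy response header with interest-cohort disabled, which tells Chrome not to include visits to that site when computing cohorts. The sketch below shows the header being set in a hypothetical Express-style Node.js server; the header is the documented mechanism, but the surrounding server code is just an example.

```typescript
// Sketch: opt a website's visitors out of FLoC cohort calculation by sending
// the Permissions-Policy header. The Express server here is a hypothetical
// example; only the header itself is the documented opt-out.
import express from "express";

const app = express();

app.use((_req, res, next) => {
  // Tell Chrome not to use visits to this site when computing FLoC cohorts.
  res.setHeader("Permissions-Policy", "interest-cohort=()");
  next();
});

app.get("/", (_req, res) => {
  res.send("FLoC is disabled for pages served by this site.");
});

app.listen(3000);
```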

Vivaldi posted a blog post titled: “No, Google! Vivaldi users will not get FLoC’ed.” In the blog post, Vivaldi makes it clear it does not support FLoC, which they call “a privacy-invasive tracking technology”. From the blog post:

“The FLoC experiment does not work in Vivaldi. It relies on some hidden settings that are not enabled in Vivaldi… Although Vivaldi uses the Chromium engine, we modify the engine in many ways to keep the good parts but make it safe for users; we do not allow Vivaldi to make that sort of call to Google.”

DuckDuckGo posted a blog post pointing out that you can use the DuckDuckGo Chrome extension to block FLoC’s tracking.

Mozilla gave The Verge a statement that included: “We are currently evaluating many of the privacy preserving advertising proposals, including those put forward by Google, but have no current plans to implement any of them at this time. We don’t buy into the assumption that the industry needs billions of data points about people, that are collected and shared without their understanding, to serve relevant advertising.”

Opera gave The Verge a statement that included: “While we and other browsers are discussing new and better privacy-preserving advertising alternatives to cookies including FLoC, we have no current plans to enable features like this in the Opera browsers in their current form”.

The fact that so many browser creators have decided against enabling Google’s FLoC is significant. It strongly suggests that FLoC is bad for users, and that Google should not impose it on people who use Chrome.