Signal Warning?

what are the MobileCoin stock options in the retirement package?

From It's Going Down

Why Moxie’s Departure is Not the End of Signal

Technical discourse almost always generates an odd sort of distortion when it collides with life outside of disciplinary separations. This is very much the case when it comes to radicals and information security. Communications, computer science and cryptography are highly complex, to the point that even people working in the security industry full time have difficulty grasping the complexities from time to time. As we saw with the Snowden leaks, the combination of highly complex technical content and the possibility of danger in the form of surveillance tended to generate a discourse grounded in hyperbole and conspiracizing. This approach often led people either to attempt to go completely dark, or to modify their practices based on false, misconstrued or misunderstood information.

Over the past several years this phenomenon has emerged in relation to Signal and Tor specifically. Without getting too far into the technical elements of each, we can definitively say that the conspiracies about weak encryption, protocols being “broken,” secret government backdoors and so on are commonly held, but ultimately damaging and false. This tendency has gained traction and accelerated in relation to Moxie Marlinspike leaving Signal.

For those of you who are aware of the history of Signal, it was developed by Moxie and others, with many of the early adopters being within the anarchist milieu. The adoption of Signal is a direct result of Moxie’s long-term connections to anarchist communities and the sheer strength of the cryptographic model. These two elements are fused in the minds of many, and trust is essential for any cryptographic system, but these concerns need not be joined together: a shift in organizational posture does not necessarily mean a shift in the technical systems themselves.

So, when we are thinking through Signal, and whether it is secure even if someone whom many identify as “one of us” is no longer making the day-to-day decisions (Moxie will remain on the board of directors), we have to separate the organizational considerations from the technical ones. On a purely organizational level, Moxie leaving Signal definitely severs a connection between that project and the anarchist milieu in general. After this shift we can no longer have a clear sense that everyone involved in the project is “on our side.” Understandably, this has generated a significant level of concern. To address that concern, we need to discuss the technical elements of integrity and security at a high level, as well as the changes to the project that we should watch for as warning signs.

On a purely technical level, without getting into the complexities of ratcheting key exchanges and their role in Signal, the absence of Moxie does not lead to a change in the code base. We have to keep two things in mind on a technical level. Firstly, the cryptographic structures used in Signal have been tested, audited, hacked away at, and have been found to be incredibly resistant to decryption. It is important to keep in mind that there is no such thing as unbreakable encryption (part of why good opsec is still necessary), but on a practical level the cryptographic structures within the Signal protocol are resistant enough to make the attempt to decrypt these messages impractical, regardless of the amount of computing power one could mobilize. At the end of the day cryptography is math, and mathematical realities still manifest in the same ways, regardless of the author of the code.
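
As a loose illustration of that point, here is a toy sketch (in Python, emphatically not Signal’s actual code) of the symmetric-key ratchet idea from the publicly documented Double Ratchet: each message key is derived from a chain key with a one-way function, and the chain key is then advanced and the old one discarded, so capturing a device today reveals nothing about yesterday’s messages. The constants and the all-zero starting key are illustrative only; in the real protocol the chain key comes out of a Diffie-Hellman handshake.

    import hmac
    import hashlib

    def ratchet_step(chain_key):
        # Derive a fresh per-message key and the next chain key from the
        # current chain key, using HMAC-SHA256 as a one-way function.
        message_key = hmac.new(chain_key, b"\x01", hashlib.sha256).digest()
        next_chain_key = hmac.new(chain_key, b"\x02", hashlib.sha256).digest()
        return message_key, next_chain_key

    chain_key = b"\x00" * 32  # toy value; really derived from a DH handshake
    for i in range(3):
        message_key, chain_key = ratchet_step(chain_key)
        # every message gets its own key; old chain keys are discarded
        print(f"message {i} key: {message_key.hex()[:16]}...")

Because each step only runs forward, the math does the protecting; who signs the commits does not change what HMAC-SHA256 can and cannot do.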

Secondly, and this is incredibly important, the project itself is open source. Just like Tor, or any other encryption-centric system worth considering, Signal publishes its source code, allowing anyone to look at the code, modify it for their own use, research its internal workings and so on. This means that any changes to the code base would be noticed by researchers who already track these changes. The code for Signal, as well as its runtime behavior, is constantly being audited by professionals, many of whom are sympathetic to our objectives as a community. Often the results of this research are published openly, and Signal has a wonderful history, specifically in the recent past, of disclosing reported vulnerabilities, describing why those vulnerabilities existed and what has been done to patch them. As such, there is not really any way to “sneak” some sort of backdoor, weakened encryption algorithm or other malicious content into the code without it being immediately noticed.

But even though the technical elements of Signal are strong today, the central concern seems to be about whether it can be trusted tomorrow, and this is an organizational question, a question of trust. Signal benefited from an organic chain of trust that vouched for those involved in the project in its early days. With Moxie departing, that chain of trust is broken. Many have spoken highly of Brian Acton, the interim CEO and one of the founders of WhatsApp, and of his dedication to privacy, but needless to say he is not someone who comes from our communities, and as a result we cannot place the same trust in him, or in whoever comes after. To ensure the integrity of the project without needing to dive into thousands of lines of highly technical code, a series of indicators can be monitored to identify changes in security posture or mentality.

The first is feature adoption. So far Signal has done an amazing job of heavily researching, and at times figuring out completely new approaches to, the implementation of features. This research is written up in the blog posts they release around updates. A really good example of this approach can be found here.

If we notice that these updates are no longer being issued, or that a bunch of features are being implemented rapidly, without the time to research their security implications, that is a potential cause for concern and would indicate a shift in internal process.

Secondly, we need to be listening to security researchers as they do assessments of the platform and code base. The information security community is filled with highly intelligent, skilled and dedicated researchers who are trying to find ways to use technology to further movements for liberation. As a result, apps like Signal are heavily audited, with findings often publicly available. This gives us a glimpse into issues with Signal and any patterns that may develop which would indicate underlying problems, poor coding practices, or the removal or weakening of security features.

Finally, Signal has a pretty good history of being transparent and responsive to reports about security issues. There are a lot of conspiracy theories out there about Signal, and plenty of poorly informed “hot takes” being issued from unreliable sources. When issues do arise that can be validated by legitimate researchers, Signal often responds immediately, does so very publicly and often issues a patch soon afterward. We can see this in the amazing post they wrote about defeating the Cellebrite UFED, which can be found here.

If they stop reporting vulnerabilities, the security community will get very up in arms about it, and we will definitely hear about it. If Signal stops being proactive about these questions, and the security community has to hold them to account, that would be a concern.

All this is to say that, as with all of our analysis, we need to depart from a place of information and the evaluation of information, and not from partial information, hyperbole or emotion; this is especially the case with technical systems. There is a lot to say about the subject of intentionality and the programming of machines. For now, however, it is enough to simply say that computers do not have emotions; they are silicon, metal and fiberglass, and like all inanimate objects they cannot have intent on their own (the intent and socialization of developers is a different question).

Therefore, on the level of encryption and its use, we have to analyze our operational security along two converging lines. On one level, there are the technical elements of the system: what it can do, what it can’t do and how it does it. This can be complicated to understand, especially with something like Signal, but there are plenty of resources explaining these processes written for non-technical or less technical people (the Electronic Frontier Foundation has some good ones). On another level, we have to develop a real understanding of what problem any specific tool is meant to solve, or what risks it is meant to mitigate.

The reality that Signal is no longer something that some of us feel politically connected to does not imply that its security is compromised, or that it will be in the future. The removal of Signal from organic chains of trust merely repositions our relationship to the system. Like any third-party app, Signal no longer benefits from being uniquely trusted for political and social reasons, and must now be analyzed for what it is: a really solid system with mathematically verifiable encryption that is sufficiently strong, and a backend that does not keep meaningful metadata about users and usage.

On some level this is not a bad thing. Now, with the veneer lifted, our analysis of Signal, and our evaluation of its usage within our contexts, can occur outside of the distortions that trust can sometimes generate. We have to look at the app and its underlying protocol as they are, as code running on a computer, with all of the benefits and limitations that this entails. This is far from the end, and at this point it is not even moving in that direction. But, like all technical systems, we need to approach them with information and suspicion.

These sorts of scenarios, and the fervor they generate, underscore a core lesson: all tools that are run by third parties need to be approached with suspicion. Systems have vulnerabilities, service providers are sometimes untruthful about what data they are collecting, email providers scan our messages; there are a thousand risks, and none of them can ever be said to be definitively dealt with. This applies in just the same way to every third-party system, whether it is Signal or Gmail. No tool is perfect, no tool can prevent every risk or provide every capability. Even with powerful tools like Signal, the fundamentals of operational security must always be kept in mind; in other words, even if it’s encrypted, don’t say illegal things on electronic devices. As the world changes, and as the dynamics of revolt and repression evolve, the risks we face and the tools we use must evolve accordingly. But this evolution always needs to be carried out with a clear eye toward the uses and limitations of any tool, whether that be a hammer or Signal.

Resources

https://ssd.eff.org/

https://www.privacytools.io/

https://freedom.press/training/

https://ccrjustice.org/if-agent-knocks-resource

https://cldc.org/security/

photo: Markus Spiske via Unsplash

There are 9 Comments

The addition of cryptocurrency transfer has caused many to speculate that banking regulators might get involved, wanting real names for those transactions (“know your customer”). This would be in the open and blatantly obvious. If Signal were ordered to push a public update requiring ID verification (accounts considered bank accounts), it is probable most users would uninstall it instead, and Signal would then die out. Regulators would probably hear this prediction in testimony, of course.

Presumably Signal could dump cryptocurrency handling to get out of banking-related regulations. A ham-handed judge, however, could order a server shutdown, or some other order that causes Signal to shut down rather than attempt to comply. If Signal gets accused of “money laundering” and the government then realizes covert surveillance is impossible, Signal’s servers could be shut down like Backpage was. Both court orders and legislation are always risks. Depending on any one nation-state not to change its laws is risky, and doubly so when that government is as unstable as the US now is.

Two ways to counter that would exist. One is that Signal themselves could exit the US and set up shop elsewhere. The US killed Backpage over sex work ads, but sex workers still have online sites based outside the US but used inside the US. Signal or a fork of it could work the same way.

The other is that, with the code being open, anyone could set up a new server, and would not be subject to court orders against another organization. If they strip out the currency transfer features, banking regulators are out of the picture.

If the US leverages lurid money laundering or human trafficking stories to get a broad ban on end-to-end encryption, this would again require a non-US server. Again, the response to FOSTA/SESTA is a template.

Forked versions could even be used with a local server, run legally or otherwise, for the duration of a single campaign. A server raid would provide zero insight into the content of the messages, and if all the phones were burners, no other useful info either.

If the cause of this is a broad regulation also affecting Telegram and FB-owned WhatsApp, it is likely the outside-the-US fork would rapidly replace the US-based Signal, just as Protonmail has replaced Hushmail for actual secure email handling.

Note that Protonmail, Tutanota and Hushmail would all be illegal to provide from a server inside the US due to a George Bush-era ban on encrypted email. That of course has no effect on US users of “foreign” encrypted email.

Good article that summarizes my feelings on the matter. Regarding the first comment expressing concerns about the cryptocurrency (mobilecoin) integration, I understand your concerns and have many of them as well. My feeling is that if Moxie, Signal, etc. ever thought they'd be forced to implement additional "Know Your Customer" things, they never would've put the work in.

The KYC regulations coming to more and more crypto operations are a joke, particularly because they undermine a key reason people are using a cryptocurrency: privacy and/or anonymity. Using a bitcoin ATM to pay for something is incredibly more expensive and obnoxious than literally any other form of payment. You’re doing it because on some level you’re at least a bit sketchy and don’t want your transaction monitored the way it is with a credit card. The more cryptocurrency purveyors deny this, the more they’re denying who their customer base really is: criminals and general sketchbags. Other than privacy reasons, there is no logical reason anyone would want to use a cryptocurrency to pay for anything. It is stupid, expensive and obnoxious, period. If you’re willing to put up with it, you must have a compelling reason, and that compelling reason is completely incompatible with KYC practices.

All this to say that, in theory, the cryptocurrencies that acknowledge that fact and build upon it with privacy/anonymity chiefly in mind (Monero, Zcash, and now MobileCoin, the Signal cryptocurrency, which is heavily inspired by Monero) could, if widely adopted, do a great thing for the struggle of turning the internet “dark”: the FBI’s worst nightmare and our fantasy. Encrypted communications are becoming ubiquitous, widely adopted and easy to use. There have been multiple very real investigations that have been hampered by Signal. Signal clearly knows how to make a user-friendly experience for encrypted messaging. If they can do the same thing for payments, I’m cautiously optimistic.

I don’t want only sketchbags using hard-to-use technology to conceal their bad (good) behavior when paying for things. I would love for a huge segment of digital payments to go dark. That would be fantastic for fucking up so many law enforcement operations, and I think Signal is making the gamble that they’re now big enough (at least 100 million users, last I heard) that they can, just like with messaging, get tons of regular people to actually use secure payments and, again just like with messaging, provide crowd cover for those who actually need that privacy protection. Because in the end, we know the NSA doesn’t give a fuck about your dick pics, but if you can be convinced your dick pics need greater security, well, that’s great, because you’re providing cover for those who really do need protection.

Also, regarding the first comment, I'm not aware of any "Bush-era ban on encrypted email". The US has no laws prohibiting use of encryption for anything, despite various statements made by the FBI wanting such prohibitions or legally mandated backdoors. Their cries for help have so far largely fallen on deaf ears.

Somewhere around 2005-2006, the former encrypted email provider Ziplip shut down over some change of US laws or regulations that had the effect of banning their encrypted email service. Years later, encrypted email provider Lavabit defied a court order to create a backdoor against the Snowden account, shutting down and destroying all data rather than cooperating. They risked jail time to do this, correctly guessing that the public furor over the Snowden revelations would deter prosecution.

The US does have regulations about telecommunications providers. While it is legal to run “information services” that the government cannot access, the same is not true of, say, the phone company. There is CALEA, which requires phone companies to build law enforcement access into voice phone communications and similar things. A phone company could not legally operate Signal or Protonmail themselves.

Twitter was threatened with thousands of dollars in fines when they initially refused to let the NSA run their PRISM software on Twitter servers against Twitter users. There is a good reason NONE of the currently widely known encrypted email services are based in the US. There is a good chance encrypted text services will have to follow suit; that risk is raised by the attention cryptocurrency transfer could attract.

Note that locally encrypted text using PGP and sent over any text or email service would be very difficult to ban, as would a peer-to-peer encrypted messaging system with no central server. Unless the government had the stones to prosecute users, they could not stop it. Routine use of HTTPS means the ability to order telecoms to drop all packets they cannot inspect has been lost; even if the NSA can crack HTTPS, the telcos cannot. The downside of the first approach is that the government can work out who talks to whom, which is as sensitive as anything in conspiracy cases.
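
To make the “locally encrypted text” option concrete, here is a minimal sketch using the python-gnupg wrapper; it assumes GnuPG is installed, the wrapper is installed (pip install python-gnupg), and the recipient’s public key has already been imported. The address is a made-up placeholder.

    import gnupg

    gpg = gnupg.GPG()
    # encrypt locally to the recipient's already-imported public key
    result = gpg.encrypt("meet at the usual place", recipients=["friend@example.org"])
    assert result.ok, result.status
    # the ASCII-armored ciphertext is plain text, so it can be pasted into
    # any email, SMS or chat service; only the private key can open it
    print(str(result))

As noted above, this protects only the content: who is talking to whom remains visible to whoever carries the message.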

In the UK, the government responded to SHAC’s use of locally encrypted email by getting a key disclosure order from the courts. SHAC defied this order and got away with it; again, prosecution was deemed too politically expensive. We do not have key disclosure orders in the US; many attorneys believe they might be a 5th Amendment violation, based on a couple of court cases from the previous decade. Probably the US government is stuck when it comes to the use of local encryption, legal or otherwise.

1- "We're still working on making Signal independent from phone numbers" (as per 2020... who knows whatever happened with that...)

2- "You mean the Google Play Store? Ah yeah, that one.. *silence on the line*."

3- "Huh... You believe in federated services are the thing of the future? I don't. Bye."

(by a very friendly and reliable anarchist who's well-connected with some anarchist milieu, somewhere)

For several years Signal has worked just fine without Google Play services or the store. Signal devs put a lot of work into that as it became obvious that Signal was popular with folks who could never safely trust Google.

You can get it from Signal's own website and then verify the signature and package contents with jarsigner. Also, the Aurora store (installable from F-Droid) is an unofficial client for Google's Play Store and can pull any non-paid package from there without a Google account.
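
If you take the direct-download route, it is worth verifying that the file you received is the file Signal published. jarsigner checks the signing certificate; a simple complementary check is comparing a SHA-256 checksum, sketched below in Python. The filename and EXPECTED value are placeholders; take the real hash from Signal's own site over HTTPS, not from a comment like this one.

    import hashlib

    EXPECTED = "0" * 64  # placeholder; use the hash published on signal.org

    digest = hashlib.sha256()
    with open("signal.apk", "rb") as f:  # placeholder filename
        for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MiB at a time
            digest.update(chunk)

    if digest.hexdigest() == EXPECTED:
        print("checksum matches the published value")
    else:
        raise SystemExit("MISMATCH: do not install this file")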

If you have Google Play services installed and not disabled, Signal will use it. If not, it will still run, at the price of a bit more battery use.

The Jan 6 investigation shows how dangerous Google is. Cell tower records are only accurate enough to prove a phone was somewhere on Capitol Hill, no good for indictments. Google high-accuracy location with location sharing on (the default) can tell inside the Capitol from outside by using known wifi and Bluetooth signals together with GPS. The FBI got a warrant against Google for every phone with location sharing on inside the Capitol on J6, excluded those expected to be there, and put in for indictments and warrants against the owners of all the others.

Any fash on J6 with Google Maps active while logged into a Google account who did not actively turn location sharing off was toast. The defenses were, in order of strength: location sharing off, location off, not logged in, no Google account to log into, Google Maps disabled, Google Play Services disabled, airplane mode, phone left at home.

None of us should ever have a phone logged into a Google account after this. Thus, we should be using Signal, or whatever replaces it, without getting it from the official Google Play Store client, at least. Knowing Google, we should also toss out Gapps, turn off "administrator apps" like "find my phone," and then we can disable Google Play Services. If your phone won't let you disable Google Play Services or the Facebook app, remove them both, along with Google Maps, over USB via the ADB (Android debugging) interface. Finally, install OsmAnd (an OpenStreetMap client) if you need a navigation app; it's open source and much more trusted. Note that Google has been busted for using Google Maps to steal location data from folks with both location and location sharing turned off, and Facebook's app has been caught "restoring" privacy settings to default, contacts sharing included. Neither has any place on an activist phone.
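
For anyone taking the ADB route described above, here is a rough sketch, wrapped in Python so it can run as a script. The pm uninstall --user 0 command removes an app for the main user without root; the package names are the standard Google and Facebook identifiers, but confirm them on your own device with adb shell pm list packages first, and be aware that removing Play Services can break apps that depend on it.

    import subprocess

    # standard package identifiers for the apps discussed above
    PACKAGES = {
        "com.android.vending": "Google Play Store",
        "com.google.android.gms": "Google Play Services",
        "com.google.android.apps.maps": "Google Maps",
        "com.facebook.katana": "Facebook",
    }

    for package, name in PACKAGES.items():
        # removes the app for user 0 without needing root access
        result = subprocess.run(
            ["adb", "shell", "pm", "uninstall", "--user", "0", package],
            capture_output=True, text=True,
        )
        print(f"{name}: {(result.stdout or result.stderr).strip()}")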
