Fireside Family Crypto Chats

A rough synopsis of what transpired:

We got caught up on news events around privacy and crypto:

1. How did the FBI get Paul Manafort's encrypted WhatsApp messages? The takeaway: don't back up your messages to the cloud, and make sure you delete them. Better yet, use something like Signal with disappearing messages set to expire after a day, a minute, or some other short interval.

2. A bit of talk about the GDPR and the ambiguity of who qualifies as a data subject under its protection

Then we made a huge list of topics that would be good to unpack and talk about with your family:

Passwords / Password Managers
Internet-enabled devices in the home (e.g. Alexa)
Spying on kids, a.k.a. kid location trackers and other child-monitoring software
Geolocation of photos and phones
DNA services like 23andMe
Cloud services
Ways to transmit important information back and forth between family members
Social media practices of family members (checking into geolocations, tagging in photos, privacy settings)
Phishing and safely opening emails / browsing the web
The discussion around “Why Privacy Matters,” and how approaching it with family may call for a different, more personal approach than strategies developed for other audiences

The group first focused on DNA privacy. People shared stories of family members using DNA-reading services and websites to trace their genealogy, and in some cases using the social-media features of these services to locate and message newly discovered relatives. A few privacy issues came up:

1. Because your DNA is shared with your biological relatives, when one person decides to use a service they are effectively making that decision for the rest of the family, often without their consent.
2. Most of the for-profit DNA-analysis companies are not HIPAA-compliant and have very broad Terms of Service that effectively allow them to resell your DNA profiles to other companies, such as insurance companies or drug companies.
3. The companies also make it easy for law enforcement to work with them, which makes it easier to identify (or misidentify) people based on their DNA. These are not government databases like police fingerprint databases, but they are effectively becoming them because law enforcement can access them so easily. A new article in Science Mag discusses this.
4. We also discussed how questionable the validity of the analysis is. Someone pointed out that no one has ever heard of a company telling a person that their spit wasn't good enough to derive DNA from; these companies appear to be shuttling samples through without effective quality control, and sometimes with ulterior motives (for example, the abuse of DNA testing in Canada to produce false positives of indigeneity). This can easily yield questionably valid results for individuals, which can then be exploited, sold, or used by law enforcement, leading to further errors.

We then discussed spy tools used on kids. The group came to the conclusion that, like most parenting decisions, this should be a conversation the family has together about which technologies they choose to use. One participant said their family had decided never to enable geolocation tracking for any family member, including teens; it was a difficult decision, but they wanted to give their kids the ability to have privacy. The other part of this discussion was getting kids, and really any family members, to understand public vs. private and guiding them toward good decisions about what they post where. One participant had their 5th-grade daughter create a private Instagram account and a public one, and for each photo they posted they would decide together whether it should be private or public. This seems like a good technique for teaching anyone about privacy. A point was also made that schools can be complicit in collecting personal data and selling it to third parties, such as in the case of the PSATs.

We talked about how to send private information back and forth within the family. The consensus was that it depends largely on the sensitivity of the information: emailing a throwaway password for a shared Netflix account might not matter, but emailing a Social Security number or the password to an email account would be really bad. It was suggested that reading passwords over the phone or sending them over a secure messenger is much better than email. Many password managers also have family features that let you store and share passwords with family members, which has the added benefit of making it easier to access a family member's passwords when they need tech support. We also discussed Signal and some other secure messengers as options, but recognized that even despite Signal's user-friendliness, it can still be a challenge to convince family members to adopt a new platform.
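As a small aside on the throwaway-password idea above: when generating a password to share over a secure channel, it is worth using a cryptographically secure random source rather than making one up. This is a minimal illustrative sketch using Python's standard `secrets` module (it was not discussed at the event; the word list and character set here are just placeholders):

```python
import secrets
import string

# Character set for random passwords; punctuation subset is an arbitrary choice.
ALPHABET = string.ascii_letters + string.digits + "!@#$%^&*-_"

def generate_password(length: int = 20) -> str:
    """Build a random password using the cryptographically secure
    `secrets` module (unlike `random`, suitable for credentials)."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

def generate_passphrase(words: list[str], n: int = 5, sep: str = "-") -> str:
    """Join n randomly chosen words from a word list; in practice you
    would use a large list such as the EFF diceware word list."""
    return sep.join(secrets.choice(words) for _ in range(n))

if __name__ == "__main__":
    print(generate_password())
    demo_words = ["correct", "horse", "battery", "staple", "orbit", "velvet"]
    print(generate_passphrase(demo_words))
```

A passphrase of common words is usually easier to read aloud over the phone than a string of random symbols, which fits the "tell it over the phone" approach mentioned above.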

Finally, we talked about why privacy matters and strategies for engaging family members in particular. A recurring strategy was to ask family members pointed questions about what they understand of the technology they use, in order to help them make decisions for themselves. One way to do this is through modes of learning they already respond well to, such as getting someone to read the book Data and Goliath by Bruce Schneier.

If you’d like to receive updates from any of the other orgs involved in this event, sign-up pages are here:
LA Cryptoparty:
CRASH Space:!forum/crashspace

P.S. This article has come out since Wednesday but it’s relevant in terms of potential abuse of IoT/spying devices within the domestic setting: