Killing Facebook softly – and asking how I can sell them my data instead?


This year I've been trying to form better habits through what I jokingly refer to as my 'Analogue Columns Lifestyle Planner Tool' – basically a daily set of columns in a notebook for ticking off stuff I want to do more/less of. Most things have been going pretty well – apart from the digital detox column. It was just way too easy to ignore the Screen Time warning limits on my Facebook, Twitter and Instagram apps.

Until this week – I had a breakthrough. A moment of empowerment and action that has been a long time coming.

It happened after reading about how (yet another) 540 million Facebook records were left exposed due to sloppy third-party developer security and then listening to Graham Cluley's Smashing Security #75: ‘Quitting Facebook’ podcast (on the same link).

One of the lines in it was about how we all hate being on Facebook but can't quit it because of FOMO (fear of missing out). Who wants to be in that resentment-filled situation of not being able to leave?

But it's so true. My own FB addiction is based on social glue, comment witticisms, creating diary-style timehop memories and social calendar notifications (and some work stuff). Which is fine but it also sucks me in beyond this as I scroll the newsfeed and feel compelled to comment or click 'Like' a few times a day (hundreds or thousands of interactions a year), each taking a moment that also adds up.

In fact, according to my Screen Time stats, the whole scroll-and-respond is adding up to an astonishing eight days a year – roughly half an hour a day – and that's just on my Facebook phone app. If I add in all my other phone interactions, including calls and text messages, it multiplies to around six weeks a year. SIX WEEKS!

And the thing that galls, of course, is that this time and information have value. Ultimately I am the product for FB. Each interaction builds up my profile for FB advertisers, and I have now handed over more than a decade of details about my life and thoughts for FB to sell on. I think we all deserve a cut of these ad profits. The free service is no longer a fair price for users.

So I've done three things to contain Facebook without losing out on the things I like. They took a few minutes to do but should make a massive difference.

  1. I went through all the privacy settings, disabling Facebook's app platform and turning off all the default 'on' stuff.
  2. I deleted my phone apps for Twitter and Facebook. This was the big one. Firstly, a lot more tracking can happen through phone apps. Secondly, the phone is always there in my pocket and is just too easy a temptation, like how biscuits start calling when you pour a tea.
  3. I moved Facebook onto the Firefox browser and put it in an extension called Facebook Container to essentially neutralise its data-hoovering powers. The extension "isolates your identity into a separate container and makes it harder for Facebook to track your activity on other websites via third-party cookies". Mozilla does not collect data from this – it only knows the number of times the extension is installed or removed. So all those Like buttons on pages around the internet won't now track my browsing, which feels very freeing (there's a rough sketch of how that tracking works just below).
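For anyone curious about the mechanism the extension blocks, here's a minimal sketch – plain Python dictionaries standing in for a browser's cookie store, not Mozilla's actual implementation, and the domain names are made up – of why a third-party Like button can tie your browsing to your Facebook identity, and how keeping Facebook's cookies in their own container breaks that link.

```python
# Toy illustration of third-party cookie tracking and container isolation.
# Plain dictionaries stand in for the browser's cookie store; this is NOT
# how Firefox or the Facebook Container extension are actually implemented.

def visit(page_domain, embedded_widgets, cookie_jar, container="default"):
    """Simulate loading a page that embeds third-party widgets (e.g. Like buttons)."""
    for widget_domain in embedded_widgets:
        # The browser attaches whatever cookie it holds for the widget's domain,
        # even though you are on a completely different site.
        cookie = cookie_jar.get((container, widget_domain))
        sent = cookie if cookie else "no cookie - anonymous hit"
        print(f"{page_domain}: request to {widget_domain} sends -> {sent}")

# 1. Default browsing: one shared cookie store. Logging in to Facebook
#    leaves a session cookie behind...
shared_jar = {("default", "facebook.com"): "session=fiona"}

# ...so a news site embedding a Like button leaks your identity to Facebook.
visit("news.example", ["facebook.com"], shared_jar)

# 2. With Facebook confined to its own container, the session cookie is keyed
#    under the "facebook" container; ordinary tabs have nothing to send.
container_jar = {("facebook", "facebook.com"): "session=fiona"}
visit("news.example", ["facebook.com"], container_jar, container="default")
```

Run it and the first request goes out carrying the Facebook session cookie, while the containerised version has nothing identifying to send – which is roughly the freedom I'm describing above.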

There are other tricks – Chrome users can get their FB feed replaced by an inspirational quote, for example. But I'm kind of interested to see how my newsfeed changes based on my detox. And I like to see friends' news. And I don't want to use Google's Chrome.

So how about capturing some of the financial value of my data for myself?

After being part of The Glass Room in London in 2017, I've been intrigued by the idea of selling my data to advertisers.

This is not a new idea but this week I've been inspired by Jean Guerrero's article in Wired on how maybe we are all targeted individuals and what this is leading to.

Towards the end of this long read, she talks about Jaron Lanier's idea that:

"we should demand payments for data that companies collect from us. He envisions a world in which we are compensated for every profitable datum we provide, with payments proportionate to the profit they produce. He argues that such a world—in which we value the humans behind data as much as the data themselves—would lead to a new era of economic prosperity, equality, and freedom."

Who couldn't do with some of that?

At the moment the only options are to restrict access or quit the platform. A container for my personal data and Facebook interactions seems a good first step towards pushing back, and it sends a message to FB (which still holds all my past data, even if it is less useful from now on).

But why not offer us the option to be both the product and the customer?

I guess because we all give ourselves too freely for Facebook to offer us any other deal; we don't value our data and often have a 'who cares?' attitude to posting on these services.

But if users could have customisable permissions and data access for a price, what then? I would love to reverse the Facebook business model and target advertisers willing to pay for access to my demographic. Might this not be something FB could monetise, too? After all, they are a data broker – and surely that can work both ways.




On becoming a Glass Room Ingenius

I RARELY LOOK at email newsletters, even the ones I've subscribed to, but in September I opened 'In The Loop' from a Berlin technology collective called Tactical Tech, and inside was a dream opportunity to build on work begun during my sabbatical.

BE AN INGENIUS FOR THE GLASS ROOM LONDON
The Ingenius is the glue that holds The Glass Room together. We're recruiting individuals who we can train up with tech, privacy and data skills in order to support The Glass Room exhibition (coming to London in October 2017). As an Ingenius you'd receive four days of training before carrying out a series of shifts in The Glass Room where you'd be on hand to answer questions, give advice, run workshops, and get people excited about digital security.

Having spent the first eight months of 2017 studying cybersecurity and cleaning up my own online practices, I had started offering free help sessions in our local café. Engagement was poor – it turns out that free infosec sessions aren't in demand because busy people tend to put these things on the backburner and just hope they don't get hacked in the meantime.

Francis Clarke, who co-runs the Birmingham Open Rights Group which campaigns around citizens' digital rights, warned me that topics like infosec and data privacy were a hard sell. Friends and family confirmed it with 'I don't care if I get sent a few contextual ads' or 'I have nothing to hide'.

So how do you get people to become aware and start to care about their online practices?

Answer: The Glass Room.

***

The Glass Room – presented by Mozilla and curated by Tactical Tech – in every way resembles a bright, shiny tech store inviting passers-by in to check out its wares. Yet another shop on a busy London street. But the items on show are not gadgets but exhibits that help people look into their online lives and think more critically about their interactions with everyday digital services.

To be honest, I mostly saw The Glass Room as providing a readymade audience who were up for talking about this stuff, because talking would enable me to get everything I'd been learning out of my head and also level up my own understanding of the issues.

I didn't think I would stand a chance of being selected but I applied anyway. I've listed some of the questions from the application and my (short version) answers for a bit more context on why I started on this journey – otherwise feel free to skip ahead.

Why are you interested in becoming an Ingenius? (provide 3 reasons)

Individually – I was blown away by Edward Snowden’s revelations and the Citizenfour documentary. I have been data detoxing and self-training in infosec, and I'm very interested in the engagement tools and workshop resources.

Locally – I'm involved in several campaigns. I want to help individuals and campaigners know how to keep their data and communications private and secure.

Nationally/internationally – I'm concerned with the normalisation of surveillance (both governmental and commercial) and how the line is constantly being redrawn in their favour. I would like to understand more about the politics of data and how to think about it more equitably in terms of the trade-offs concerned with policing, sensitive data sharing, commercial data capture and the individual right to privacy.

What do you think about the current state of privacy online?

I have concerns both about privacy clampdowns by governments and mass surveillance by commerce. I love the internet but find it worrying that I have to jump through so many hoops to avoid being tracked or identified. I feel I am part of some subversive resistance just to have control of my own data, and this is intensifying as I have a writing project that I want to keep anonymous (almost impossible, I have since discovered). I'm also concerned that enacting the paths to anonymity may flag me on a list and that this may be used against me at some future point, especially if there is no context in the data.

I think our right to privacy is disappearing and the biggest issue is getting people to care enough to even talk about that. We seem to be giving up our privacy willingly because of a lack of digital literacy about how our information is being used, the dominance of data brokers such as Google and Facebook (for whom we are the product), the lack of transparency about how algorithms are processing our data, and so on. The issue feels buried and those who control information too powerful to stop.

How would you take the experience and learning as an Ingenius forward?

I’ll be taking it into my local community through advice surgeries in cafés and libraries. There seems to be little privacy/security support for individuals, activists, campaigners and small businesses. I also hope it will give me the wider knowledge to become more involved with Birmingham Open Rights group, which operates at a more political level.

Finally, I aim to connect more widely online around these topics and investigate options for setting up something to help people in Birmingham if I can find suitable collaborators.

***

I'M IN!

This is one of those things that will completely take me out of my comfort zone but will also likely be one of the best things ever.

***

THE GLASS ROOM saw 10,000 people come through the doors when it ran in New York City. In London, on the busy Charing Cross Road just up from Leicester Square, the figure was close to 20,000.

I was fretting about all sorts of things before my first shift, mostly about standing on my feet and talking to people all day – normally I sit at a desk and say nothing for eight hours that isn't typed. I was also nervous that despite the excellent four days of Glass Room training, I wouldn't know enough to answer all the random questions of 'the general public', who might be anything from shy to panicked to supertechy.

But it was fine. More than fine, it was exhilarating, like the opening night of a show you've been rehearsing for weeks. If anything, I had to dial it back so that visitors would have a chance to figure things out for themselves. The team were lovely and the other Ingeniuses supportive and funny. Most importantly, the visiting public loved it, with 100-strong queues to get in during the final weekend of the exhibition.

It must be a complete rarity for people to want to come in, peruse and engage with items about wireless signals, data capture and metadata. But because the exhibition materialised the invisible, people were able to socialise around the physical objects and ask questions about the issues that might affect them, or about the way big data and AI are affecting human society.

Day after day, people wandered in off the street and began playing with the interactive items in particular: facial recognition to find their online lookalikes, nine volumes of leaked passwords to find their password, newsfeed scanning to find the value of their data, the stinky Smell Dating exhibit to find out who they were attracted to from the raw exposed data of three-day-old T-shirts (c'mon people – add some metaphorical deodorant to your online interactions!).

They also spent time tuning into the trailers for highly surveillant products and brands, and watching an actor reading Amazon Kindle's terms and conditions (just under nine hours, even in the bath).

And they gathered en masse around the table-sized visualisations of Google's vast Alphabet Empire that goes way beyond a search engine, Amazon's future Hive factory run mostly by drones and other robots, Microsoft's side investment into remote-controlled fertility chips, Apple's 3D pie charts of turnover and tax avoided, and Facebook founder Mark Zuckerberg's House where you can buy total privacy for just $30 million.

***

THERE WERE THREE themed areas to explore inside The Glass Room, with three further spaces to go deeper and find out more:

  1. Something to hide – understanding the value of your data and also what you are not hiding.
  2. We know you – showing what the big five of GAFAM (Google, Amazon, Facebook, Apple and Microsoft) are doing with the billions they make from your online interactions with them.
  3. Big mother – when technology decides to solve society's problems (helping refugees, spotting illegal immigrants, health sensors for the elderly, DNA analysis to discover your roots), the effect can be chilling.
  4. Open the box – a browsing space on the mezzanine floor full of animations to explain what goes on behind the screen interface.
  5. Data Detox Bar – the empowerment station where people could get an eight-day Data Detox Kit (now online here) and ask Ingeniuses questions about the exhibition and issues raised.
  6. Basement area – an event space hosting a daily schedule of expert talks, films and hour-long workshops put on by the Ingeniuses.

During the curator's tour by Tactical Tech co-founder Marek Tuszynski, what impressed me most was the framing for The Glass Room. This is not a top-down dictation of what to think but a laying out of the cards for you to decide where you draw the line in the battle between convenience and privacy, risk and reward.

I handed out kit after kit to people who were unaware of the data traces they were creating simply by going about their normal connected life, or unaware that there are alternatives where the default isn't set to total data capture for future brokerage.

Some people needed talking down after seeing the exhibition, some asked how to protect their kids, others were already paranoid and trying to go off the grid or added their own stories of life in a quantified society.

***

THERE ARE THREE LESSONS I've taken away from my experience in The Glass Room to apply to any future sessions I might hold on these topics:

  1. Materialise the invisible – bring physical objects (art, prototypes, kits, display devices) so that people can interact and discuss, not just read, listen or be told.

  2. Find the 'why' – most people are unaware of, or unconcerned about, the level of data and metadata they produce until they see how it is aggregated and used to profile, score and predict them. Finding out what people care about is where the conversation really starts.

  3. More empowerment and empathy, less evangelism – don't overload people with too many options or strategies for resistance, or polarise them with your own activist viewpoint. Meet them where they are at. Think small changes over time.

***

IT'S BEEN A MONTH SINCE The Glass Room and I'm proud of stepping up as an Ingenius and of overcoming my own fears and 'imposter syndrome'.

As well as doing nine shifts at The Glass Room, I also ran two workshops on Investigating Metadata, despite being nervous as hell about public speaking. There are eight workshop modules in Tactical Tech's resources, so it would be interesting to work these up into a local training offering if any Brummies are interested in collaborating on this.

I wrote a blog post for NESTA about The Glass Room – you can read it here: Bringing the data privacy debate to the high street.

I did the Data Detox Surgery at an exhibition called Instructions for Humans at Birmingham Open Media, and also set up a mini version of The Glass Room with some pop-up resources from Tactical Tech – there's a write-up about that here. The Ingenius training gave me the confidence and knowledge to lead this.

Leo from Birmingham ORG has also had Glass Room training so we will be looking for opportunities to set up the full pop-up version of The Glass Room in Birmingham in 2018. Get in touch if you're interested – it needs to be a place with good footfall, somewhere like the Bullring or the Library of Birmingham perhaps, but we're open to ideas.

There's also a more commercial idea, which arose at the Data Detox Surgery, to develop this as an employee engagement mechanism within companies to help make their staff more cyber-secure. If employees learn more about their own data privacy and can workshop some of the issues around data collection, then they are more likely to care about company processes around data security and privacy. In short, if they understand the personal risks, they will be more security-conscious when working with customer or commercial data.

Update: In March 2018 I launched a data privacy email for my home city – you can read all about it here.

As ever, watch this space, or get in touch if you think any of this should be taken to a coffee shop for further discussion and development. You can also connect with me on Twitter if you want to follow this journey more remotely.

Thanks for staying to the end.




How to make your cybersecurity event more engaging

I'm fascinated by how cybersecurity enthusiasts and organisers present and run their events, as that seems to be crucial in (a) getting people to come along and (b) triggering action.

I attended three cybersecurity events in September – Cryptoparty London, Cy3sec and Cybersecurity for 'Real People' – and learnt a lot from how they engage, or don't. Conclusion: Infosec events need to be a LOT more practical and engaging and to deliver on what they promise. Drinks/snacks also help with after-work events.

1. Cryptoparty London


Organised by:

A tech consultancy and a civil rights group put together the London event, but this is just part of a larger decentralised movement of CryptoParties, with events happening all over the world. "The goal is to pass on knowledge about protecting yourself in the digital space. This can include encrypted communication, preventing being tracked while browsing the web, and general security advice regarding computers and smartphones."

https://www.cryptoparty.in/london 

Approach

Put it in a bar, call it a 'party', have infosec-themed cocktails, offer interactive break-out workshops (on Tor browser, Bitcoin, email encryption and smartphone surveillance) and lightning talks with a stage and large screen, surveillance-based visuals, digital art and music. September was the tester – it went very well and is now going monthly.

Pros

  • Beginners welcome
  • Networking, sense of community, expert access
  • Top pedigree of speakers, eg, Silkie Carlo, co-author of Information Security for Journalists
  • A nice dark room and sociable vibe for tired people after work
  • Practical workshops, how-tos and Q&As
  • Stickers and swag on the tables

Cons

  • It's held in London – I'm in Birmingham
  • It ran way over time so I missed my second workshop
  • Logistics – bar noise/numbers made workshops hard to hear for some
  • Attendees seemed highly engaged and knowledgeable already – bar too high for newbies?

Summary

CryptoParty's main objective is to "tear down the mental walls which prohibit people to even think about these topics" – on that aim, it was definitely the best for engagement and practical learning. I'm now set up on Tor Browser and just wish I could have stayed longer.

2. Cy3Sec

Organised by:

Fizzpop – a popular Birmingham-based maker/hacker group with its own workshop space. Its first cybersecurity workshop was set up on Meetup and is set to run monthly.

https://www.meetup.com/fizzPOP-Birminghams-Makerspace/events/243198601/

Approach

One presenter talking to attendees around a table, small group style. There was a tech fail on the projector front which didn't help. The speaker was a real-life locksmith so the focus was very much on how the hackers break in. The Meetup blurb said:

"The first hour will be on 'beginner' topics, then half an hour to chat, then an hour on a more advanced topic(s). If people want to do a short talk, great. There may be Bluetooth lock picking. There might be hacking a local server. A talk on decapping chips. If you've something to teach or explain about, please let us know."

Pros

  • Beginners welcome
  • Quiet workspace, easy to get involved
  • Unusual angle – locksmith/hacker, physical access to devices
  • Free-roam topics and tech nerd view (how to kill people and start wars through hacking) = an interesting experience!

Cons

  • Attendees were Fizzpop members, a brain surgeon and someone with a Masters in cybersecurity – not exactly beginner-friendly
  • Mostly a one-way talk, lots of assumed knowledge, and attack-based, with the cybersecurity solutions more of an afterthought
  • Departed from promised structure and timings
  • Sense of being an outsider entering a tech nerds' members' club

Summary

I never knew where this session was going, what I was going to get, or even when it was going to end. Some structure and communication would really help. The Fizzpop-style focus on physical hacking, USB baiting and 'how stuff works' was way above my knowledge grade, but learning how to hack could fill a useful gap if done at beginner level and with the sense of playful fun that is the Fizzpop way.

Despite the exclusive feel, I am tempted to go back – albeit with a flask of tea and some biscuits, and just enjoy the random weirdness of Fizzpop life.

3. Cybersecurity for 'real people'

Organised by:

The Open Rights Group Birmingham – which runs regular events on cybersecurity and data privacy for concerned citizens. It feels more political although the offer is also practical. It campaigns to protect and promote digital rights in Birmingham and beyond. It was also set up on Meetup:

https://www.meetup.com/ORG-Birmingham/events/242706511/

Approach

The purpose was to offer practical cybersecurity advice that ‘real people’, not just digital geeks, can understand and apply in their daily lives. There were two main speakers, a large screen, a Powerpoint presentation and chairs for the audience. Although it was billed as a workshop, it was really more of an advice session/talk, with little opportunity to interact – one of the problems of running through a set of slides.

Pros

  • Beginners welcome – had the most varied mix of people of all three events
  • Darkened room for viewing slides (though the acoustics weren't great)
  • Practical advice on sending secure emails and messages, password managers, Tor browser
  • Beginners friendly – idea of just 'change one thing'
  • Friendly, open, inclusive vibe
  • Resources posted on the Meetup site (Update: more resources, tips and follow-up from the session have now been posted to ORG B'ham)

Cons

  • More political stance, which may put some people off; it would be good to know more about the trade-offs rather than just follow advice blindly
  • Tried to pack too much in – people asked more in-depth questions but there was no time to cover them
  • Top-down talk – less engaging than a practical workshop

Summary

This was my first ORG session and the organisers obviously know their stuff, but it was a skim across the surface and felt like an intro session to a longer course. I think they could increase engagement with less content and more practical focus, and as the session started at 6.30pm, maybe see if they can get sponsorship for some refreshments as most people come directly from work.

The immersive option?

Data privacy is a hard sell, even though it's one of the biggest issues of our time, with surveillance and data capture growing exponentially, often obfuscated and kept out of sight.

Most people know they should do 'something' but maybe think it's too techy, or a hassle, or like me, tell themselves that they'll get around to it one day and hope they don't get sprung in the meantime. In short, there are barriers for everyone to overcome.

This next event could be the answer… and I'm pleased to report that I've managed to get a spot helping out at The Glass Room London, which opens for three weeks at the end of October.

Curated by Tactical Tech and produced by Mozilla, The Glass Room was attended by over 10,000 visitors in New York City last year.

It is ALL about the engagement, with people coming in off the street to an immersive, dystopian tech store that exposes the state of their data privacy. Data Detox Kits will be handed out. And there will be interactive exhibits.

It looks really really good, and will be blogged.

Seven ways the Bank of England encourages a culture of cybersecurity


“What is important to you?” This is the first question to ask before planning any cybersecurity strategy, according to John Scott, Head of Information Security Education at the Bank of England, talking at the recent Cybersecurity UK Roadshow event in Birmingham (notes here). Because if you don’t know what a client or company values, if you don't understand their business priorities, you can only talk in absolutes.

As Scott gently points out, 80% of people working in the Security Awareness field come from a background in IT or security and there is a tendency to talk in absolutes. While things are moving towards more nuanced conversations around risk, finding people who can both listen and communicate well on this topic can be difficult. The result in many large organisations is an environment of enforced compliance; getting workers to care and engage beyond that is a tough sell.

'From compliance to culture, awareness to action' was the title of Scott’s talk. He said compliance and awareness aren’t enough; it’s building a culture of mature security that is required to stay safe. Scott then rated security culture on a scale of -1 (negative behaviours) to 0 (compliance behaviours) to +1 (security maturity and positive behaviours), and outlined the Bank's encouragement of the following ‘cyber seven’ practices to move from compliance towards maturity (more of which below):


1. Passwords

0 = don't share passwords

+1 = use a password manager

2. Phishing

0 = don't click on suspect email links or open attachments.

+1 = report suspicious emails (whether clicked or not)

3. Document classification

0 = classify documents when saved into document management system

+1 = mark docs clearly, dispose of confidential documents safely

4. Clear workspace

0 = don't leave confidential material on your desk

+1 = also check printer, whiteboards, keysafe when you leave

5. Remote working

0 = make sure you are not overlooked when working on trains

+1 = keep your remote token separately from your laptop when travelling; report loss of devices immediately

6. Social media

0 = don’t post photos of the Bank on social media or get involved in discussions related to the Bank’s mission on social media without permission

+1 = audit your social media profile – make sure you’re aware of what you and other people are saying about you.

7. Report it

0 = if you see anything that worries you, tell us – 'See it, Say it!'

+1 = if you've done something yourself or caused a problem, report it

This final point raised a lot of questions in the audience – wouldn't a major breach be a sackable offence, for example? Why would employees admit their error? Scott suggested awareness and education – perhaps telling stories about how coming forward has worked – and trying to build trust with your employees.

It's always better to know that a breach or a vulnerability has occurred so you can address it but you need people to feel secure in coming forward. As the Regional Organised Crime Unit noted in their talks at the roadshow, one of the biggest issues in cybersecurity is the lack of reporting.

Thanks to John Scott and Metsi Technologies for use of the slides.

Notes from Cyber Security UK Roadshow Birmingham

A one-day event held yesterday at Innovation Birmingham on the Aston Uni campus to help businesses get to grips with cybersecurity. It was organised by Metsi Technologies, and supported by the National Police Chiefs' Council and the Regional Organised Crime Unit (ROCU) in the West Midlands. The Twitter account and hashtag were @cybersec_uk but the backchannel was pretty quiet. Here are my notes.

Cybercrime

The increasing threat of cybercrime runs across a range of levels, from nation-state threats to ransomware to IP theft. There were various police chiefs in attendance and the main message seemed to be that cybercrime is massively under-reported to police – with the result that sufficient budget isn't being assigned.

Ashley Bertie, Assistant Police and Crime Commissioner for the West Midlands, sent out a plea to find out what your local police force is doing and engage with their agenda. One available resource that has just launched is the Digital PCSO (Sean Long in the West Midlands) who can go into business organisations, schools and the community and advise on security basics.

John Davies of Pervade Software then introduced the National Cyber Security Strategy, consisting of three main acronyms:

  • NCSC – the National Cyber Security Centre (at GCHQ) – pushes out national strategy.
  • CiSP – Cyber Security Information Sharing Partnership – a place to both get free advice and also report hacks.
  • CES – Cyber Essentials Scheme – certification scheme to show that a business has addressed basic cybersecurity.

Main cybersecurity threats for SMEs

Louis Augarde, lead pen tester for Omni Cyber Security, introduced these as:

  • Ransomware – disruption for financial gain
  • Credentials-based attacks – to gain an entry point
  • Breaches based on known vulnerabilities – often used as a first step to identify weak systems that can be exploited further
  • Phishing emails – to gain credentials and access
  • DDOS – freezes your system temporarily but can also be a smokescreen for more serious attacks

He also introduced me to the idea of baiting, a social engineering tactic to get hold of your personal info by leaving out a USB for people to pick up. Never plug an unknown USB found on the train into your computer!

Cybersecurity help for Birmingham SMEs

If there’s one thing for businesses to do now it is the Cyber Essentials Scheme, said John Davies. Participants address 68 questions on their cybersecurity systems around firewalls, patches, configuration, malware, user accounts and so on. The scheme costs £300 and provides an annual certificate.

The CES process is designed to prevent the vast majority of cyber attacks but also offers a badge to show that a business has made an effort to keep the supply chain more secure.

Other options mentioned include the 80-question IASME governance standard, costing £400, which also looks at data assets, risk assessments, people, policies and disaster recovery. Both CES and IASME were said to be a good foundation for securing businesses and a more achievable alternative to the 500+-question ISO 27001 international standard.

There is also the newly launched West Midlands Cyber Security Cluster, the 19th in the UK, and people, businesses and organisations can tap into this to get help and advice in tackling cybersec issues. The website looks as if it has teething problems right now so check back later.

Other links mentioned on the day were:

Takeaway quotes and stats

95% of all successful attacks are the result of well-known and entirely preventable vulnerabilities (various reports from 2011)

“Don’t buy the whole onion – security is best built in translucent layers” – Brian Chappell, Beyond Trust, introducing five main layers for organisations wanting simpler security (focus on the high risks, tackle lateral movements of hackers into your system, exercise privilege control, one standard user account for all, configuration management).

The first reported cybercrime was in 1820 – the sabotage of some newly invented tech, the Jacquard loom, which automated the weaving process. DCI Rob Harris suggested this was where the term ‘patch’ came from but I’m not convinced that is true.

“Why do they do it? I’ve sat opposite many cyber criminals in my job, some as young as 16, and their answer to this is ‘because they deserve it’.” – National Police Chiefs Council on cyber crime motivation.

“80% of people [in cybersecurity roles] have an IT or security background and they tend to talk in absolutes. You have to find people who can listen and communicate.” – John Scott, Bank of England

GDPR for businesses

Jane Burns of Anthony Collins Solicitors made a valiant attempt at an overview of this super-complicated incoming regulation from May 2018.

The EU GDPR, also being adopted in the UK despite Brexit, offers a whole different world of pain so I’m not going to get into it here. Basically, if you’re not already aware, businesses are going to have to get a whole lot better and more transparent about how they process data, or they risk big fines and, even worse for some, being cut off from accessing their data for a period of time.

This photo may be useful…

[Photo: Jane Burns' GDPR slide]

What does the Bank of England do?

What does the most secure place in England do to prevent cybercrime?

John Scott, Head of Information Security Education at the Bank of England, gave a great presentation on one of the biggest problems facing companies – that of lack of user engagement in an organisation's cybersecurity practices. He said compliance and awareness aren’t enough; it’s building a culture of mature security that is required to stay safe.

I enjoyed this talk so much I’m going to blog it separately.

Next event: a London CryptoParty on 11 September, a mix of cocktails and practical workshops…

The dick* pic guide to government surveillance

* and boob

I had a conversation with a family member recently about my growing interest in cybersecurity and they responded with 'I've got nothing to hide so I'm not worried'. Basically, let the government watch them if it stops terrorists; it's all good.

For someone like me, who grew up during the 1980s Cold War (but also basically made a second career out of Web 2.0), it's about how much they are watching, centralised files, a culture of fear, lack of freedom, potential abuse of political power – and trying to understand the trade-offs of privacy versus security when we put our info out there.

I don't think I have anything to hide either – except when I do – but it's not about having something to hide, it's about having something to protect. We're not just talking about status updates knowingly shared on Facebook, Twitter, etc; the info at risk is also the stuff you think you are keeping private: phone calls, files and photos stored in the cloud, SMS, email.

Getting people to care about surveillance and infosecurity is apparently an issue, with cybersecurity events often struggling to attract an audience. Calling it infosec or cybersecurity is a kiss of death, according to a friend who runs such events. (It's true: I'm going to an evening event in London because it's a CryptoParty in a bar with beer sponsors, etc, whereas a day-long 'cybersecurity roadshow' in Birmingham was a much harder sell.)

To help with the 'who cares' issue, I finally got round to watching John Oliver's 2015 'Last Week Tonight' interview in Moscow with Edward Snowden – a deliciously awkward affair in which Oliver played a rude, dumb American asking Snowden, the nice, intelligent whistleblower, to explain in layman's terms ('Can I share my dick pics or not?') why Americans should give a shit about increasing government surveillance powers and his 2013 revelations.

If you haven't seen it, it's well worth a watch. My notes below…

Notes: Government Surveillance: Last Week Tonight with John Oliver (HBO)

  • Section 215 of the Patriot Act (created post 9/11, and extended/renewed) requires businesses to hand over 'any tangible things' (eg telephone records) to protect against international terrorism.
  • Snowden in 2013 revealed this to be used for the mass scooping up of data.
  • Government says it doesn't abuse its powers + there are restrictions on how/when they can employ surveillance, eg, through the FISA Court, which grants surveillance warrants.
  • The reality is that FISA rarely rejects an application. From 1979 to 2013, it approved 35,434 applications for surveillance and rejected only 12.
  • Snowden: "NSA has the greatest surveillance capabilities that we have ever seen. Now, what they will argue is that they don't use this for nefarious purposes against American citizens. In some ways that is true but the real problem is that they are using these capabilities to make us vulnerable to them, and then saying, well, I have a gun pointed to your head but I won't pull the trigger – trust me."
  • Is anyone having the conversation about where the limits should be, eg reform of Section 215? Public debate is not happening (that care issue again).
  • Oliver asks whether it is possible for the public to have a conversation about something so complicated that we don't fundamentally understand it. He shows Snowden a video of Americans getting upset about the government sharing and looking at their dick pics. The rest of the interview is framed through this simple analogy.

Can they see my dick?

Section 702 surveillance – yes – through bulk collection if an emailed image crosses a border in some way and is caught on a database.

Executive Order 12333 – yes – the NSA uses this order when others aren't aggressive enough, so if a Gmailed pic is sent even to a fellow American, it will be stored on a Google server, and Google may move this data from data centre to data centre – the US government can capture it if it moves outside of the US even temporarily.

PRISM – yes – it captures your info with the agreed help/involvement of government deputies/sheriffs such as Yahoo, FB, Google.

Upstream collection – yes – they can 'snatch your junk' as it transits the internet.

MYSTIC – yes, if you're describing your junk on the phone. It collects content as well in some countries, eg The Bahamas.

Section 215 metadata – no, but can tell who you are sharing junk pics with (eg a penis enlargement centre).

So what next?

Snowden says: "You shouldn't change your behaviour because a government agency somewhere is doing the wrong thing. … If we sacrifice our values because we are afraid, we don't care about those values very much."

My take is:

  • Keep doing what you're doing but send/share your stuff via more secure platforms
  • Try to understand the lay of the political and digital landscape and don't give away freedoms that are at risk.
  • Figure out the trade-offs and fight back against government surveillance where it is an invasion of privacy/freedom – I'm not saying terrorism and other threats shouldn't be addressed, of course not, but scaling up government powers shouldn't be done thoughtlessly or as a knee-jerk reaction to modern threats without a thought for historical ones that threaten all our civic freedoms. Debate publicly and find the line.

Since Snowden… a visit to Infosecurity Europe 2017

Fiona Cullinan, Infosec Europe 2017

'Since Snowden' has become a bit of a catchphrase for me after his revelations in 2013 about the mass government surveillance of our data. Since Snowden I've watched Citizenfour, read The Snowden Files, completed two OU cybersecurity courses, joined ORG Birmingham, learnt how to use PGP encryption, risk-audited my personal info and started putting some basic processes in place so I am more in control of my data.

This is something I hope to start helping other people with, so if you have a question about password managers or how to risk-assess your info, for example, get in touch. I'm still learning so it's basic guidance only and probably best done at a friendly local level rather than in any official capacity.

Last month I also attended two days of Infosec Europe, the largest event of its kind in Europe, featuring a conference programme, 360+ exhibitors and around 15,000 visitors. It was very much aimed at larger organisations and, since I'm at the individual and SME level, there was some disconnect.

That said, it was probably one of the best conferences I've attended outside of SXSW and I came away with a lot of info and contacts – enough to know that this is going to remain a definite interest of mine for some time to come.

So I've started a Twitter list of Women in Infosec because I missed that session at #infosec17.

And collected a few conference links for reading and reference:

Hello Infosec World.