Home to Geordie Stewart's blog on information security awareness, risk communication and security ethics.

Risk Intelligence
Information Security Awareness

  • Criminals and Moral Codes
  • Rumor Has IT – Fake News and Cyber Security
  • Five Minutes With Lance Spitzner
  • Security Awareness Tips From A Social Engineer
  • The Craziest Information Security Stories of 2016
  • 7 Habits of Highly Successful Security Policies
  • Keeping IT Simple
  • Polluting The Privacy Debate

About this blog

Martin Luther King said ‘I have a dream’, not ‘I have a plan’

– Simon Sinek

Engaging end users through marketing, psychology and safety theory.

Popular posts

  • Getting Permission To Use HaveIBeenPwned From Your Legal Dept (4th April 2018)
  • The Craziest Information Security Stories of 2017 (4th January 2018)
  • Rumor Has IT: How Fake News Damages Cyber Security (7th June 2017)
  • The Craziest Information Security Stories Of 2016 (11th February 2017)

About Geordie Stewart

Geordie Stewart, MSc, CISSP, is an international speaker and keen innovator in the area of technology risk communication.

His award-winning master's thesis at the Royal Holloway Information Security Group examined information security awareness from a fresh perspective: as a marketing and communications challenge. In his regular speaking appearances at international information security conferences such as RSA, ISACA and ISSA, he challenges conventional thinking on risk culture and communication.

In addition to holding senior security management roles in large UK organisations, Geordie writes the security awareness column for the ISSA international journal.


Awareness Blog

Categories: Blog, Conferences, Featured, Mental Models, Organisational Culture, Privacy, Risk Compensation, Risk Psychology, Safety, Security Awareness, Security Economics, Security Metrics, Surveillance, Trust
6th January 2015

Keeping IT Simple

The landing gear light indicated a problem. The captain, first officer and flight engineer of Eastern Air Lines Flight 401 tried to figure out what was wrong. They removed the light assembly and the flight engineer left his position to go to the avionics bay and investigate. They were so preoccupied with a burnt out…

Blog, Safety, Security Awareness · By Geordie
4th December 2014

Leveraging Existing Audience Beliefs

When it comes to security awareness, there's no such thing as a blank canvas. Your audience will already have preconceived notions about your topic. The language, tone and media you use will invoke associations in people's minds, both helpful and unhelpful. These associations will influence how people view the root causes, likelihood and potential outcomes…

Blog, Mental Models, Risk Psychology, Safety, Security Awareness · By Geordie
2nd March 2013

ISSA Security Awareness Column Jan 2013 – Bad Apples in Big Barrels

There’s no denying that some people are impervious to our attempts at security awareness and refuse to listen to warnings or instructions. There is a temptation when things go wrong to label such people as ‘bad apples’. I think this saying is overused. Originally, the expression ‘bad apple’ referred to a rotten apple in a barrel that would spoil the good apples. Usage of the phrase has changed, and it’s now often used to explain failures of scale. The perception is that when there are many apples, you have to expect some of them to be bad.

I often hear the phrase used when a governance failure is attributed to human mistakes. Frequently, however, I think ‘bad apple’ is a convenient cover for poor management, where processes and procedures were badly designed or supervised. The bad apple narrative can suit the prejudice that humans are the weak link, and any narrative is more comforting than no narrative at all. However, bad apple narratives rarely withstand serious scrutiny.

Blog, Organisational Culture, Safety, Security Awareness, Trust · By Geordie

ISSA Security Awareness Column October 2012 – Learning From Safety Risk Communications

Any endeavour is made doubly difficult when pursued without metrics and without a clear understanding of cause and effect. When we stumble in the dark, facts are the flashlight of comprehension, illuminating the way forward when the path is unclear. Information security is often required to function in the dark, with little in the way of facts to guide us. We hear noises and bump into things but can never be certain whether we’re going in the right direction.

When security fails, how do we know? While failures of integrity and availability are obvious, failures of confidentiality can be silent and insidious. Some actors, such as LulzSec, boast about their exploits and derive their benefits from the resulting publicity. Other actors quietly go about their ‘business’, and organisations may not realise they’ve been breached. Often, even when we do discover failures of confidentiality, the organisational interest is to bury them. As a result, our profession is rich in rumours but poor in facts, which makes it difficult to understand the effectiveness of security controls.

Blog, Risk Psychology, Safety, Security Metrics · By rskadmin
Copyright © 2015 Risk Intelligence Ltd.