Schneier on Security

A blog covering security and security technology.


June 13, 2012

Teaching the Security Mindset

In 2008, I wrote about the security mindset and how difficult it is to teach. Two professors teaching a cyberwarfare class gave an exam where they expected their students to cheat:

Our variation of the Kobayashi Maru utilized a deliberately unfair exam -- write the first 100 digits of pi (3.14159...) from memory -- and took place in the pilot offering of a governmental cyber warfare course. The topic of the test itself was somewhat arbitrary; we only sought a scenario that would be too challenging to meet through traditional studying. By design, students were given little advance warning for the exam. Insurrection immediately followed. Why were we giving them such an unfair exam? What conceivable purpose would it serve? Now that we had their attention, we informed the class that we had no expectation that they would actually memorize the digits of pi, we expected them to cheat. How they chose to cheat was entirely up to the student. Collaborative cheating was also encouraged, but importantly, students would fail the exam if caught.

Excerpt:

Students took diverse approaches to cheating, and of the 20 students in the course, none were caught. One student used his Mandarin Chinese skills to hide the answers. Another built a small PowerPoint presentation consisting of three slides (all black slide, digits of pi slide, all black slide). The idea being that the student could flip to the answer when the proctor wasn’t looking and easily flip forwards or backward to a blank screen to hide the answer. Several students chose to hide answers on a slip of paper under the keyboards on their desks. One student hand wrote the answers on a blank sheet of paper (in advance) and simply turned it in, exploiting the fact that we didn’t pass out a formal exam sheet. Another just memorized the first ten digits of pi and randomly filled in the rest, assuming the instructors would be too lazy to check every digit. His assumption was correct.

Read the whole paper. This is the conclusion:

Teach yourself and your students to cheat. We’ve always been taught to color inside the lines, stick to the rules, and never, ever, cheat. In seeking cyber security, we must drop that mindset. It is difficult to defeat a creative and determined adversary who must find only a single flaw among myriad defensive measures to be successful. We must not tie our hands, and our intellects, at the same time. If we truly wish to create the best possible information security professionals, being able to think like an adversary is an essential skill. Cheating exercises provide long term remembrance, teach students how to effectively evaluate a system, and motivate them to think imaginatively. Cheating will challenge students’ assumptions about security and the trust models they envision. Some will find the process uncomfortable. That is OK and by design. For it is only by learning the thought processes of our adversaries that we can hope to unleash the creative thinking needed to build the best secure systems, become effective at red teaming and penetration testing, defend against attacks, and conduct ethical hacking activities.

Here's a Boing Boing post, including a video of a presentation about the exercise.

Posted on June 13, 2012 at 12:08 PM • 58 Comments


Comments

Juxtapose this with the maxim "any person can invent a security system so clever that she or he can't think of how to break it."

Posted by: Jack at June 13, 2012 12:22 PM


What is most surprising to me is how simplistic most of the solutions were. I would have expected more sophisticated approaches, though obviously the students didn't have much time. Also surprised no one was caught -- I wonder: was the proctor aware that all the students were expected to cheat?

I look forward to reading the study tonight after work. Thanks for the thought-provoking post (as usual).

Posted by: Joseph R. Jones at June 13, 2012 12:44 PM


So, since the point is to teach security, if I actually managed to learn 100 digits of pi (which is quite easy if you know mnemonic techniques), then I'm cheating by not cheating?

Posted by: Oops at June 13, 2012 12:44 PM


I don't see how this teaches students the Security Mindset. I thought the Security Mindset was about designing things yourself to be secure by thinking like a cracker, not simply thinking like a cracker to crack other people's designs.

Posted by: anonymous moose at June 13, 2012 12:53 PM


@anonymous moose: Which is more effective, asking 20 people to list ways in which an exam can be made secure, or asking 20 people to find ways to cheat?

Posted by: NobodySpecial at June 13, 2012 1:12 PM


I am a recent Information Security & Assurance graduate, and I think this exercise, as well as any of the others written about by these instructors, would have been incredibly useful, interesting, and exciting to be a part of.

However, I think this practice would have to be carefully monitored, and I honestly don't see many universities allowing instructors to perform it. As intuitive as the overall message here is, the practice is in direct violation of any university's policy.

Posted by: Alex at June 13, 2012 1:19 PM


@Oops: yes, I thought that too. But better still, really memorise 100 digits of pi, write them from memory in the exam, and afterwards, claim to have cheated – but never let on how. With any luck the examiners will conclude that your cheating method was the best of the lot, because not only did they not catch it at the time, they still can't figure out what it was!

Posted by: Simon Tatham at June 13, 2012 1:22 PM


I'm amused that I've read 2/3 of their recommended fictional reading. They seem to have good taste, so I should take a look at "Critical System Error".

Re: University prohibition against cheating. I'd posit that if your social engineering skills aren't up to finding a loophole in the university policies, you aren't qualified to teach the course in the first place.

Posted by: Kevin Granade at June 13, 2012 1:47 PM


This raises interesting religious and moral questions for those of us in the field. It seems similar to questions related to being a soldier: "Does my religion allow me to kill in war?"

A similar question could be raised here: "Do my religious beliefs about lying and cheating prevent me from being an effective infosec professional?"

Well, at least these are questions that come up in my head.

Posted by: Sean at June 13, 2012 1:49 PM


@Oops: That's how Naruto did it, IIRC.

Posted by: No One at June 13, 2012 1:50 PM


Kids these days. When I was in college, all of my friends knew at least 50 digits of pi. What has Google done to us all?

Posted by: John Chew at June 13, 2012 2:01 PM


"Cheating exercises provide long term remembrance, teach students how to effectively evaluate a system, and motivate them to think imaginatively."

This paper is the same as just about every other paper on "the pedagogy of X" that I've read -- and as a computer science professor, I've read a lot of them: sweeping claims (such as the ones above) about the authors' techniques, with not a shred of data to support those claims. It's all anecdotal and speculative.

To demonstrate that their approach really is effective, the authors should have done something like this: Take 20 students who had gone through this cheating exercise (the test group) and 20 students who had not (the control group). A year later, perhaps in a successor course, have the 40 students do an exercise where they would have to demonstrate the adversarial mindset, such as developing some secure system. Evaluate the actual security of the students' systems. See if there is any significant difference in security between the test group's systems and the control group's systems. If so, then and only then would the authors be justified in making the claims they did.
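(For concreteness, a minimal sketch of what that significance check might look like, as a simple permutation test on the two groups' security scores. The score lists below are invented purely for illustration:)

    import random
    from statistics import mean

    def permutation_test(test, control, trials=10_000):
        # Two-sided permutation test on the difference of group means.
        observed = abs(mean(test) - mean(control))
        pooled = test + control
        n = len(test)
        hits = 0
        for _ in range(trials):
            random.shuffle(pooled)
            if abs(mean(pooled[:n]) - mean(pooled[n:])) >= observed:
                hits += 1
        return hits / trials  # p-value: chance of a gap this large by luck alone

    # Hypothetical security-evaluation scores (0-100) for the two groups:
    cheating_group = [72, 65, 80, 58, 77, 69, 74, 61, 83, 70]
    control_group  = [66, 59, 71, 55, 68, 62, 73, 57, 70, 64]
    print(permutation_test(cheating_group, control_group))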

Have the authors done such a rigorous scientific study? No. Will they ever do this? I doubt it. If they did, would the results be significantly different between the two groups? Again, I doubt it.

Sure, the technique of "controlled cheating" exercises to teach the adversarial mindset is provocative, and makes for a titillating paper. But I don't see any evidence that the technique at all helps to achieve the goal.

Posted by: Alan Kaminsky at June 13, 2012 2:17 PM


I don't understand how the cheating part is so difficult when you're given a computer to assist.
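Indeed -- generating the digits on the spot is only a few lines. A minimal, self-contained sketch using Machin's formula with plain integer arithmetic (assuming Python, or anything comparable, is available on the exam machine):

    def arctan_inv(x, one):
        # arctan(1/x), scaled by the integer `one`, via the Taylor series
        term = total = one // x
        k, sign = 3, -1
        while term:
            term //= x * x
            total += sign * (term // k)
            k, sign = k + 2, -sign
        return total

    def pi_digits(n):
        # Machin's formula: pi = 16*arctan(1/5) - 4*arctan(1/239)
        one = 10 ** (n + 10)          # ten guard digits against rounding drift
        pi = 16 * arctan_inv(5, one) - 4 * arctan_inv(239, one)
        s = str(pi)
        return s[0] + "." + s[1:n]    # n significant digits

    print(pi_digits(100))             # 3.14159265358979323846...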

Posted by: Captain Obvious at June 13, 2012 2:25 PM


@Alan Kaminsky
100% agreement.

Not only that, but by the time the kids have gotten to college, I would guess they've already had experience either cheating or seeing someone cheat.

And those past experiences were far better than the scenario outlined in the paper, simply because they probably had real consequences for being caught.

As for the "adversarial mindset", that's easy. That's taught in most sports. You look for weaknesses in the other team's defense and you try to exploit those weaknesses.

Moving that to computer security should be just as easy.

Teams A and B defend and attack simultaneously while still providing remote services to team C.

Anything short of threats of harm to people is allowed.

Posted by: Brandioch Conner at June 13, 2012 2:42 PM


Fatal System Error by Joseph Menn

Posted by: James B Robinson at June 13, 2012 2:49 PM


That is a useful skill for software testers even outside the realm of security testing.

Posted by: Bill Smith at June 13, 2012 3:14 PM


When I was in high school Latin class, one student sat in front of the class bulletin board, which had clippings and stuff tacked to it. He put his cheat sheet on the bulletin board, and nobody noticed.

Posted by: Phil at June 13, 2012 3:37 PM


Why would a university policy against cheating be a problem? It's a little meta, but clearly the test was to successfully demonstrate a way to cheat on an exam. It clearly wasn't to actually produce those digits, since the guy who made up an answer passed.

However, I would have caught 'Oops' -- it's not that much more effort to also check the last few digits. Even three digits would be enough to reduce the odds of a successful cheat to 1-in-1000.
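(A quick simulation confirms the arithmetic: spot-checking just the last three digits catches the random-fill bluff about 999 times in 1000. A sketch, with the reference decimals of pi baked in:)

    import random

    PI_DECIMALS = ("1415926535" "8979323846" "2643383279" "5028841971"
                   "6939937510" "5820974944" "5923078164" "0628620899"
                   "8628034825" "3421170679")   # first 100 decimals of pi

    def random_fill():
        # the paper's lazy cheat: first ten digits real, the other 90 random
        return PI_DECIMALS[:10] + "".join(random.choice("0123456789") for _ in range(90))

    def spot_check(answer, k=3):
        # the lazy-but-wiser grader: compare only the last k digits
        return answer[-k:] == PI_DECIMALS[-k:]

    trials = 100_000
    passed = sum(spot_check(random_fill()) for _ in range(trials))
    print(passed / trials)   # ~0.001, the 1-in-1000 odds noted above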

Posted by: Bear at June 13, 2012 3:49 PM


I have never cheated on any exam. I was never tempted. If the material was good, it was well worth learning; if not, I avoided the exam (sometimes possible) or just took a bad grade. I find cheating is more effort and risk than it is worth. Studying the examiners -- their preferences, their exam style -- brought me into the top 5 students of 240 in my year when I did my Informatics Diploma (MSc). Some people have even described me as the most honest person they know and the hardest to corrupt. (I don't think that is a fair assessment. While I may be hard to corrupt, I will lie on occasion, when I see a good reason for it.)

On the other hand, I have absolutely no problem with the security mindset. There may be an element of not wanting to do it in real time, but I believe I could cheat well under real-time conditions by now.

My point is that while this will work well for some students, it may fail for others. (Of course, I may just be distant enough from the norm for it not to matter. Apparently "normal" people lie and cheat all the time, even without good reason or significant benefit...)

However, as with any true learning, students have to discover their own style of learning and advance their own way of approaching problems. The described exam is a bit more of a stunt than of teaching. If it was necessary to create awareness in the students first, it may have some value. Otherwise I would advise activities where students design security measures and then break their own and those of some others.

Posted by: Gweihir at June 13, 2012 5:54 PM


@Jack: I find it difficult to design security systems that I cannot break. True, there are some details -- for example, an input data check -- that may actually not be breakable. For more complex things I find that, because of the boundary conditions (there are always some), I make trade-offs and know well how to break them, but I aim for prohibitive attack effort. There is also an aspect of time: I sometimes find bugs in my engineering by thinking about the problem again, anywhere from days to years later.

So, no, I think that maxim is only true for pretty bad designers. Of which there are a lot, admittedly.

Posted by: Gweihir at June 13, 2012 5:59 PM


However much I like the comparison with the Kobayashi Maru, I'm totally with Alan and Brandioch on this.

Cheating does not make up for lack of talent, knowledge or creativity, and doesn't turn a Salieri into a Mozart. As every professional in any line of work knows -- whether he be an artist, an engineer or a humble pastry baker -- the difference between great and mediocre is determined by thinking out of the box and enlarging your frame of reference. Cheating has got nothing to do with it.

Years ago, the company I was then working for had won a contract with a government customer whereby the existing IBM AIX install base was replaced by Sun Solaris machines. Unfortunately, the IT manager was less than happy with this and was hell-bent on kicking everything Sun back out. The sales reps had committed themselves to Solaris doing everything AIX did and more, and at one point he found that the native Solaris printing subsystem was unable to print to a series of plotters for which no printer drivers were available. Developing custom drivers was not an option, and none of the techies could find a workable solution. Managers and sales reps alike were desperately trying to find loopholes in contracts and T&Cs as a way out. Until one guy came up with the simple idea of replacing the native printing subsystem with CUPS, solving the by-then seriously escalated issue in about half an hour. Now that's the difference between cheating (loopholes) and out-of-the-box thinking.

To put it simply: the thesis of these two professors is, IMHO, the wrong answer to the right question. But I'm less than surprised that this sort of idea is being launched in the context of "cyber war", which to a lot of folks today seems to be a new and emerging battleground where no rules or treaties apply. I find that a really scary idea. It's back to mustard gas and "take no prisoners".

Posted by: Dirk Praet at June 13, 2012 6:32 PM


Many sins can be concealed under the rubric of a training exercise.

As for University policy, it's not an actual "exam" -- it's a lab experiment using the classroom itself as the lab, with the consequences for failure being a lower grade.

3.14159265358979323846 (from memory)

Better yet, assign two lab exercises: one where half the students are proctors and half are students/cheaters, then swap a week later.

Rig the room for lots of covert video and audio, then present the finding(s).

I recall an incident at UC where a student was caught cheating red-handed in a large undergraduate lecture class. He boldly challenged the TA and asked, "Do you know who I am?"

The TA, puzzled, said "No."

So the student immediately buried his blue book in the pile of student blue books being turned in, and fled stage right. Cheating successful, failure to detect identity of cheater.

Posted by: Andrew at June 13, 2012 8:02 PM


@Gweihir - at what German university did you study?

Posted by: Fastes diplom student in karlsruhe at June 13, 2012 8:51 PM


@Joseph R. Jones "What is most surprising to me is how simplistic most of the solutions were."

They had to actually use the solutions they came up with, so of course they favored the ones that succeeded with the least effort (and if their first few ideas were hard to implement, they kept thinking until they came up with something easier).

It occurs to me that while unethical people attack anything whose defeat is high profit/low risk, ethical people are attracted to finding creative attacks against 'interesting' (i.e. difficult) targets, as a sort of puzzle-solving game.

Thus ethical people who think about these things might overestimate the likelihood that a complex attack is useful or necessary.

Posted by: pfogg at June 13, 2012 9:50 PM


How ironic. In a fast-changing, complex, arcane field where skills acquisition is supremely important, you boil education down to a cute exam trick. As others have pointed out, there's more to security and security education than gimmicks. There is no algorithm for good security practice, much less any bumper sticker.

Or maybe ... just as the students were told not to take the exam on its face, maybe the post is itself a clever ruse, designed to hide a deeper truth ...

But I doubt it.

Here are some more thoughts on the art of security and management generally: lockstep.com.au/blog/2010/12/21/...

Posted by: Stehen Wilson at June 13, 2012 11:51 PM


@Alan Kaminsky: I agree with your larger point. The article is interesting to read, but there is nothing to prove that such an exercise taught them anything. (But it is a cool exercise and does no harm.)

However, the test you suggest would fail almost any part of any curriculum. A single few-hours-long part of the curriculum is rarely that influential; you can remove it and the end result will be largely the same.

Remove too many seemingly useless lessons, though, and you will end up with a very watered-down curriculum. (I think this is how the dumbing down of schools happens.)

Posted by: aaaa at June 14, 2012 1:59 AM


Waiting for Clive's post on this...

Possibly along the lines of reprogramming a TI calculator/ Timex watch to display all 100 digits of Pi :-)

Posted by: AC2 at June 14, 2012 2:46 AM


@no one
The episodes where Naruto goes through a test like this are 23-25 of the original series. Currently available at
www.crunchyroll.com/naruto

Posted by: naruto fan 28 at June 14, 2012 3:09 AM


@Andrew: The "do you know who I am" anecdote is a widespread "university legend" or just a joke which can be heard retold all around the globe in many versions and languages.

Posted by: Peter A. at June 14, 2012 3:12 AM


@AC2

Yeah, he must have a very fulfilled life, given all the things he has tried, made, and witnessed personally, always perfectly fitting the specific blog post. With all this knowledge and experience, I'm still wondering why he doesn't set up his own blog and become a known security expert in his own right.

If I ever have too much time on my hands and am really bored, I may read all his comments here and search them for contradictions, e.g. being at two distant locations at the very same time, to prove he made all this stuff up.

:-)

Posted by: A. N. Onymouse at June 14, 2012 3:18 AM


@Andrew and @Peter A.: The meme (appropriate designation...?) was used in a TV ad a couple of years ago. Should be on YouTube, but I don't recall the brand... Non-effective advertising...

Posted by: Jurgen at June 14, 2012 3:55 AM


Edit: Found it. www.youtube.com/watch?v=r30rw64OlDA Now apply this lesson to infosec.

Posted by: Jurgen at June 14, 2012 3:58 AM


@Peter A.: It is a widespread legend, but that doesn't keep people from trying it and occasionally succeeding. I've seen it work in a first-semester math exam (in that case, the student wanted to turn in his test after time had passed).

Posted by: Autolykos at June 14, 2012 4:11 AM


I'd say that if 20 students were cheating -- most of them with something as simple as a sheet of paper under the keyboard -- and none were caught, something's wrong with the proctoring.

And that no instructor actually checked the answer beyond ten digits? Um.

J.

PS - Finally, the government teaching cyberwarfare to university students should be protested at least as much as having ROTC on campus. J.

Posted by: Jon at June 14, 2012 5:05 AM


Reminds me of Frank Abagnale and similar 'I'm a pilot/surgeon/police -- catch me if you can' people.

What mechanism draws the line before 'I'm a cybercrime security specialist' is reached?

Or indeed a 'university professor'?

Posted by: Anders Thulin at June 14, 2012 6:32 AM


In reading the comments I'm awestruck by the number of people who dismiss this test because it is not an all-encompassing course on how to be an info-sec pro.

The purpose of this test was to help the students gain perspective. To replace their paradigm. To expand their mindset.

And to, ugh, quote Apple: "Think different."

Posted by: BJ at June 14, 2012 6:56 AM


@BJ, the post was grandly titled "teaching" the mindset, rather than something more modest like "illustrating". You go even further, suggesting the exercise will "replace their paradigm".

I think many of the comments are in the nature of warnings against generalisations drawn from small samples and ill-designed experiments.

Surely the very circularity of prescribing cheating limits the usefulness of this sort of test if it were ever to become institutionalised.

Posted by: Stehen Wilson at June 14, 2012 7:33 AM


In Farrar's "Eric", the students take it in turns to build a crib sheet for the next test. They just stick it to the front of the teacher's desk so they can all copy from it.

Doubles up as a group version of the prisoner's dilemma.

Posted by: bob at June 14, 2012 8:59 AM


Actually, I am going to disagree with some here. Teaching a security mindset is overblown. Some people "get" it and some don't. I think you can teach it to some degree, but many will never be skilled at it.

I found the highlighted test cute. I would have approached it from a historical standpoint: what did the U.S. do to get the Yorktown back up and turned around in 72 hours? Or, for Clive, I seem to remember Churchill and chaff. I would be curious to see (I know it's too easily turned into junk science) what nationalities/personality traits may be extrapolated.

Some people are going to be inclined toward, and even gifted in, the security mindset. You can hone it and guide it, but some will still be horrible at it. E.g., I can tell you how to fix/do something, but you DO NOT want me wielding a power tool. Actually, I may wind up welding a power tool to the workbench. Microsoldering, yep; chainsaw, no...

Posted by: jacob at June 14, 2012 9:35 AM


Interesting exercise. I'd love to see an iterative process, where on the second attempt either a) it is explicitly stated nobody is allowed to use a cheat that had been previously used, or b) it's not explicitly stated, but the proctor is given a list of the first exercise's cheats and thus is on the lookout for those things.

Also, not sure if it's really spelled out in the full article (haven't read it yet), but it sounds like you could boil the lesson down to three points:

  • Understand the system
  • Understand the flaws *in* the system
  • Understand how to go *outside* the system

Posted by: Quirkz at June 14, 2012 10:00 AM


@Jurgen, thanks for the video link. I'm sure it's happened many times in many places - but in the specific incident I recall in the mid 90s at UC (forgive my vagueness), the pile of blue books was on the floor, the challenger was a TA and not a professor, and no fruit was harmed. Also, the student had cheated by copying from another student. The TA asked others if they recognized the student and received no answers.

Posted by: Andrew at June 14, 2012 10:11 AM


@Dirk Praet,

I agree with you that thinking out of the box is an innate talent, one that most infosec designers lack for this task because of the complexity of the challenge.

Guy Kawasaki said, "those on the first curve are unable to comprehend, let alone embrace the second curve."

If that is indeed the case -- that people can't even "recognize" real innovation when it jumps the curve -- then who has the ability to think out of the box and come up with one, given that one tends to start with what one already knows and build on it?

Posted by: Rob Lewis at June 14, 2012 10:29 AM


    Hey, "A. N. Onymouse," interesting post. We know that you're really Clive, though.

    Posted by: Thunderbird at June 14, 2012 11:13 AM


As an engineer (retired) in a very large, internationally competitive field, I've had to encourage the same "cheat" mindset to design and build various machinery and to bypass someone else's patents. There is no choice but to generate a creative 'work-around' solution. We as engineering managers want our team to come up with innovative ideas -- the same thing we want our students to do. Our perceived problem was administrators who would rather pay royalties than take risks.

Posted by: S.W. Duck at June 14, 2012 12:49 PM


@Thunderbird

"A.N. Onymouse" couldn't be Clive. All the words in the post were spelled correctly. :-)

Posted by: Rookie at June 14, 2012 2:53 PM


@Fastes diplom student in karlsruhe: Karlsruhe as well. I might know you ;-)

Posted by: Gweihir at June 14, 2012 5:48 PM


Lol, awesome. Some kids in my university used to cheat by mail-ordering 'IR contact lenses' from online gambling-cheater sites and writing info with 'invisible ink' all over their arms, hands, legs and whatever else they could use. Nobody was ever caught; I bet they still use this method.

Posted by: Awesome at June 14, 2012 7:29 PM


I'm amazed that, with all these comments about the security mindset vs. cheating, I have not seen it mentioned that a cheater ONLY really needs to find ONE good way to cheat, whereas a good security person has to defeat EVERY conceivable cheating method. It is kinda asymmetric...

So showing people how easy it is to develop one workable cheat does not in any way show the breadth of workable cheats. A really good cheater with 5 or 6 different proven/perfected cheat methods would still be only an average security professional.

Posted by: RobertT at June 14, 2012 10:02 PM


Interesting. I did a course on negotiation a while back, and one exercise was the red/blue game; as all of the course attendees were from BT System Engineering, we spent ages trying to find flaws/an optimal strategy in the game to win it.

Though shouldn't you really be teaching meta-game thinking?

Does this mean that OGAs should be looking for long-time role players familiar with min-maxing and power gaming? (flutters eyelashes in a hopefully seductive way -- Sir Harry, you know you need a replacement for that poor PFY that bought the farm last year)

Wish I had kept the narked email I got from one of the founders of a well-known gaming company when I developed an interesting character by taking one aspect of their world to its logical conclusion. (werewolf rights activist)

I did like my idea of werecats doing Lo-Lo from C130s without parachutes, though -- cats land on their feet, right?

Posted by: neuromancer at June 15, 2012 11:05 AM


A fascinating study, but I think that the real value of the lesson would have been during the debrief, not during the test itself. During the test you are only exposed to one way (or two ways) to cheat, but during the debrief you can learn 20+ ways to cheat - as @RobertT noted, one way to cheat is often not enough.

Posted by: John E. Bredehoft at June 15, 2012 12:47 PM


I wonder if I am the only one who would have already known the first 100 places? I would then have acted highly suspiciously, but not obviously using any particular method to cheat. When challenged, I would have refused point blank to reveal how I had cheated, unless given something substantial in return, such as a guaranteed 10-year tenure at the Uni. in a teaching position.

Sometimes the greatest "cheat" is pure dumb luck! For a similar reason I have learned how to correctly pronounce "Llanfairpwllgwyngyllgogerychwyrndrobwllllantysiliogogogoch" (The Welsh Village) just to be absolutely insufferable someday to someone who is boring me...

Posted by: TeaKaye at June 15, 2012 7:55 PM


Surprised at how many commenters got hung up on the term "cheating" here. It's not cheating if you are told that the expectation is to find some way other than actually remembering the answer to provide it -- so it is not in violation of anything at all. What it is teaching is adversarial thinking, which is a foreign concept and practice for most developers. Speaking from personal experience as an architect, most developers are focused on how to make something work, and inherently think along the lines of cooperative processes and behavior. After all, one cannot make something work if the pieces one uses are misbehaving. It is a natural, and productive, mindset. But security issues inherently live in a world of adversarial relationships.
