CCS
2005
Tutorials
[ Tutorial I || Tutorial II || Tutorial III || Tutorial IV ]
Tutorial I
Common Ways Cryptography is Mis-used and How to Get It Right
14:00 - 15:30
Cryptography is now sufficiently mature that we can use standard
algorithms with high confidence that they are secure. However, as we
discuss, it is very easy to take a perfectly good cryptographic
primitive and use it in a completely insecure way. We survey examples
where this has happened and extract valuable lessons on how to avoid
repeating the mistakes of others. Topics include random number
generation, (mis-)using counters, side-channels, and Do-it-Yourself
cryptography.
John Black is Hudson Moore Professor of Computer Science at the
University of Colorado at Boulder. Prior to this, Dr. Black was a
developer in the database industry where he worked on a variety of
security-related projects. Dr. Black's research interests lie
primarily in cryptography and cryptanalysis, particularly in the
construction of fast and provably-secure algorithms and in the
analysis of cryptography applied to networks and computer systems. For more information: www.cs.colorado.edu/~jrblack
Deception is an appealing means for computer network defense
(CND), as it pits the defender's strengths against the hacker's
weaknesses. Hackers rely heavily, if not exclusively, on a single
source of information--network data. Because that data is easily
manipulated, the hacker is highly vulnerable to deception. The
defender, by contrast, has physical control of the network and knows
it well. Further, deception can be used to attack the hacker's
decision-making processes, giving defenders an offensive security
measure--something computer security defenders sorely lack.

This tutorial explains how deception operations can be designed and developed for CND, including incident response, intelligence, detection, and prevention. Deception processes, principles, and techniques are presented, based on the underlying nature of deception and the extensive military literature on deception. The presentation focuses on enduring principles that are useful for conducting deception operations. Applications to honeypot systems are also covered, as honeypots are currently the most widely used form of deception.
Fred Feer is retired from a career with U.S. Army counterintelligence,
the CIA, RAND, and independent consulting. Deception has been an
interest and area of professional specialization for over 40 years.
Email: ffeer@earthwave.net

Jim Yuill is a senior PhD student in the Computer Science Department at North Carolina State University. His thesis topic is the same as the tutorial's: "Deception Operations for Computer Security: Processes, Principles and Techniques". Jim has taught over a dozen graduate and undergraduate computer science courses, and he previously worked at IBM in operating-systems development. Email: jimyuill@pobox.com

For more information: www4.ncsu.edu/~jjyuill/Professional/index.html
Software vulnerabilities are common and pervasive. As software becomes
increasingly critical to our infrastructure, it is important to
understand and learn from past coding errors. Software companies are
recognizing the risk that vulnerabilities pose to their customers and
to their own market success, and major software companies have begun
to require that their developers take courses specifically on secure
coding practices.

This tutorial provides an in-depth description of common programming errors that can lead to string overflow, integer overflow, and format string vulnerabilities; illustrates how these programming errors can be exploited; and provides practical mitigation strategies. The tutorial is designed for software developers and for security teams at large organizations. It assumes some knowledge of C and C++ programming, but is taught so that audience members without a complete background in this area will still benefit.
Robert C. Seacord is a senior vulnerability analyst at the CERT
Coordination Center (CERT/CC) at the Software Engineering Institute
(SEI) in Pittsburgh, PA, and author of Secure Coding in C and C++
(Addison-Wesley, 2005). An eclectic technologist, Robert is coauthor
of two previous books, Building Systems from Commercial Components
(Addison-Wesley, 2002) and Modernizing Legacy Systems (Addison-Wesley,
2003), as well as more than 40 papers on software security,
component-based software engineering, Web-based system design,
legacy-system modernization, component repositories and search
engines, and user interface design and development.

Jason Rafail is a vulnerability analyst at the CERT Coordination Center (CERT/CC) at the Software Engineering Institute (SEI) in Pittsburgh, PA. Jason coordinates the analysis of vulnerabilities with major vendors and industry leaders to resolve and disclose vulnerability information to the public and private sectors. Additionally, he represents the CERT/CC in efforts to promote international relations and collaboration activities.
To address the needs for security education, many universities have
incorporated computer and information security courses into their
undergraduate and graduate curricula. These courses teach students how
to design, implement, analyze, test, and operate a system or a network
with the goal of making it secure. Pedagogical research has shown that
students' learning is enhanced if they can engage in a significant
amount of hands-on exercises. Therefore, effective laboratory
exercises (or course projects) are critically important to the success
of computer security education.

This tutorial first gives an overview of the existing laboratory designs adopted by many universities; we then focus on an approach (the iSYS approach) that we have been experimenting with for four years. The iSYS approach uses an Instructional operating SYStem, Minix, as the underlying lab environment for designing exercises for system-oriented computer security education. It is motivated by the success of similar approaches in traditional courses such as Operating Systems, Compilers, and Networking.

The iSYS laboratory projects consist of two parts. The first part focuses on design and implementation: students add new security mechanisms to the underlying Minix system to enhance its security, including access control, capabilities, sandboxing, and encrypted file systems. In the second part, students are given a modified Minix operating system that contains a number of injected vulnerabilities; they must identify, exploit, and fix those vulnerabilities.

Although the lab exercises require changes to an operating system, all of them can be conducted in a general computing environment without superuser privileges; moreover, most of the exercises can be finished within 2-3 weeks. More information about iSYS can be found at www.cis.syr.edu/~wedu/SCIENS/seed/iSYS.html
Wenliang (Kevin) Du is an Assistant Professor in the Department of Electrical Engineering and Computer Science at Syracuse University. He received his Ph.D. from the Computer Science Department at Purdue University in 2001. His research focuses on three areas: privacy-preserving data mining, sensor-network security, and the use of instructional operating systems for computer and network security education.