Computational thinking is not just or all about computer science. The educational benefits of being able to think computationally--starting with the use of abstractions--enhance and reinforce intellectual skills, and thus can be transferred to any domain.
Computer scientists already know the value of thinking abstractly, thinking at multiple levels of abstraction, abstracting to manage complexity, abstracting to scale up, etc. Our immediate task ahead is to better explain to non-computer scientists what we mean by computational thinking and the benefits of being able to think computationally.
Jeannette Wing is head of the Computer Science Department at Carnegie Mellon
University and the President's Professor of Computer Science. She earned her
bachelor's, master's and doctoral degrees at the Massachusetts Institute of Technology
and has been a member of the Carnegie Mellon faculty since 1985.
From 2007 to 2010, Wing served as assistant director for the Computer and
Information Science and Engineering Directorate of the National Science Foundation.
She is a fellow of the American Academy of Arts and Sciences, the American
Association for the Advancement of Science, the Association for Computing
Machinery and the Institute of Electrical and Electronic Engineers.
Navigating, Learning and Capturing the Latent Semantic Pathways in an E-mail Corpus
E-mail, while originally designed for asynchronous communication, now serves many other functions, including rolodexing and archival storage. Many users suffer from excessive email and attempt
to alleviate the problem with a personal categorization or foldering scheme. However,
given the sheer volume of email received, manual categorization does not serve as a
viable solution. Any attempt to redesign email communication to better suit its
current tasks will be in tension with the legacy epistemology that a user has of her
Inbox. I propose a system that will enable multi-dimensional categorization, two
example dimensions being social networks and action items. The system attempts to
discover latent semantic structures within a user's corpus and uses them to perform email
categorization. A user's social network is an example of an underlying semantic
structure in an email corpus. The unsupervised message classification scheme
developed is based on discovering this social network structure. The system extracts
and analyzes email header information contained within the user corpora and uses it
to create a variety of graph-based social network models. An edge-betweenness
centrality algorithm is then applied in conjunction with a ranking scheme to create a
set of participant clusters and corresponding message clusters. Having an explicit
mapping between a participant and message cluster allows the user to mold the
system to fit in with the legacy epistemology and to train it for further use. In addition
to this, the system can evolve with time and adapt to new semantic structures. Initial
results for the classification scheme are highly encouraging. Novel methods of
navigating through an email corpus are also explored. Latent semantic indexing and
other similarity measures are used as the basis for an interactive system that will
allow the user to extract underlying semantic structure from a corpus and capture it
for later use.
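The clustering approach the abstract describes (build a participant graph from message headers, then split it by removing high edge-betweenness edges) can be sketched with off-the-shelf graph tooling. Everything below is invented for illustration: the names, the toy graph, and the use of networkx's Girvan-Newman routine, which stands in for the thesis's own ranking scheme.

```python
# Sketch: participants who co-occur in message headers form a graph;
# removing the edge with highest betweenness splits it into clusters.
import networkx as nx
from networkx.algorithms.community import girvan_newman

# Toy "co-recipient" graph: an edge means two people appeared together
# in the headers of at least one message. All names are hypothetical.
G = nx.Graph()
G.add_edges_from([
    ("alice", "bob"), ("alice", "carol"), ("bob", "carol"),   # one cluster
    ("dave", "erin"), ("dave", "frank"), ("erin", "frank"),   # another cluster
    ("carol", "dave"),                                        # weak bridge
])

# Girvan-Newman repeatedly removes the edge with highest betweenness;
# the first split should cut the bridge between the two triangles.
first_split = next(girvan_newman(G))
clusters = [sorted(c) for c in first_split]
print(clusters)
```

Each participant cluster would then index the messages whose headers fall inside it, giving the participant-to-message-cluster mapping the abstract mentions.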
The emergence of low-cost fabrication technology (most notably 3D printing)
has brought us a dawn of making, promising to empower everyday users with the
ability to fabricate physical objects of their own design. However, the technology
itself is innately oblivious of the physical world—things are, in most cases, assumed
to be printed from scratch in isolation from the real world objects they will be
attached to and function with. To bridge this 'gulf of fabrication', my thesis research focuses on developing fabrication techniques with tool integration that enable users to expressively create designs that can be attached to and function with existing real
world objects. Specifically, my work explores techniques that leverage the 3D
printing process to create attachments directly over, onto and around existing objects;
a design tool further enables people to specify and generate adaptations that can be
attached to and mechanically transform existing objects in user-customized ways; a
mixed-initiative approach allows people to create functionally valid designs that address real world relationships with other objects; finally, by situating the
fabrication environment in the real world, a suite of virtual tools would allow users to
design, make, assemble, install and test physical objects in situ directly within the
context of their usage. Overall my thesis aims to make fabrication real—innovation
in design tools harnesses fabrication technology, enabling things to be made by real
people, to address real usage and to function with real objects in the world.
Databases: Their Creation, Management and Utilization
Information systems are the software and hardware systems that support data-
intensive applications. The journal Information Systems publishes articles concerning
the design and implementation of languages, data models, process models,
algorithms, software and hardware for information systems. Subject areas include
data management issues as presented in the principal international database
conferences (e.g. ACM SIGMOD, ACM PODS, VLDB, ICDE and ICDT/EDBT) as
well as data-related issues from the fields of data mining, information retrieval,
internet and cloud data management, business process management, web semantics,
visual and audio information systems, scientific computing, and organizational
behaviour. Implementation papers having to do with massively parallel data
management, fault tolerance in practice, and special purpose hardware for data-
intensive systems are also welcome.
All papers should motivate the problems they address with compelling
examples from real or potential applications. Systems papers must be serious about
experimentation either on real systems or simulations based on traces from real
systems. Papers from industrial organisations are welcome.
Theoretical papers should have a clear motivation from applications. They
should either break significant new ground or unify and extend existing algorithms.
Such papers should clearly state which ideas have potentially wide applicability.
In addition to publishing submitted articles, the Editors-in-Chief will invite
retrospective articles that describe significant projects by the principal architects of
those projects. Authors of such articles should write in the first person, tracing the
social as well as technical history of their projects, describing the evolution of ideas,
mistakes made, and reality tests.
Technical results should be explained in a uniform notation with the emphasis
on clarity and on ideas that may have applications outside of the environment of that
research. Particularly complex details may be summarized with references to
previously published papers.
We will make every effort to allow authors the right to republish papers
appearing in Information Systems in their own books and monographs.
Dennis Shasha, Gottfried Vossen
Today, creating an academic website goes hand-in-hand with creating your CV
and presenting who you are to your academic and professional peers. Creating and
maintaining your website is an essential tool in disseminating your research and
publications. Use your academic personal website to highlight your personality,
profile, research findings, publications, achievements, affiliations and more. In
addition, by using some of the many social media tools available, you can further
amplify the information contained in your website.
An academic personal website takes you a step further in terms of increasing
your visibility because it is an ideal place to showcase your complete research profile.
You will attract attention to your publications, your name recognition will increase
and you will get cited more. Moreover, a website is also useful for networking and
collaborating with others, as well as for job searching and application.
Online storage is an emerging method of data storage and back-up. A remote
server with a network connection and special software backs up files, folders, or the
entire contents of a hard drive. There are many companies that provide a web-based storage service.
One offsite technology in this area is cloud computing. This allows colleagues
in an organization to share resources, software and information over the Internet.
Continuous backup and storage on a remote hard drive eliminates the risk of
data loss as a result of fire, flood or theft. Remote data storage and back-up providers
encrypt the data and set up password protection to ensure maximum security.
Small businesses and individuals choose to save data in a more traditional way.
External drives, disks and magnetic tapes are very popular data storage solutions. USB or flash drives are very practical for storing and backing up small volumes of data. However, they are not very reliable and do not protect the user in case of loss or damage.
Types of network
Following our meeting last week, please find my recommendations for your
business. I think you should set up a LAN, or Local Area Network, and a WAN, or
Wide Area Network, for your needs. A LAN connects devices over a small area, for
example your apartment and the shop. In addition, you should connect office
equipment, such as the printer, scanner and fax machine, to your LAN because you
can then share these devices between users. I'd recommend that we connect the LAN
to a WAN so you can link to the Internet and sell your products. In addition I'd
recommend we set up a Virtual Private Network so that you have remote access to your company's LAN when you travel.
VPN is a private network that uses a public network, usually the Internet, to
connect remote sites or users together.
Let's meet on Friday to discuss these recommendations.
The Digital Divide
A recent survey has shown that the number of people in the United Kingdom
who do not intend to get internet access has risen. These people, who are known as
'net refuseniks', make up 44% of UK households, or 11.2 million people in total.
The research also showed that more than 70 percent of these people said that
they were not interested in getting connected to the internet. This number has risen
from just over 50% in 2005, with most giving lack of computer skills as a reason for
not getting internet access, though some also said it was because of the cost.
More and more people are getting broadband and high speed net is available
almost everywhere in the UK, but there are still a significant number of people who
refuse to take the first step.
The cost of getting online is going down and internet speeds are increasing, so
many see the main challenge to be explaining the relevance of the internet to this
group. This would encourage them to get connected before they are left too far
behind. The gap between those who have access to and use the internet and those who do not is known as the digital divide, and if the gap continues to widen, those without access will get left behind
and miss out on many opportunities, especially in their careers.
The First Computer Programmer
Ada Lovelace was the daughter of the poet Lord Byron. She was taught by
Mary Somerville, a well-known researcher and scientific author, who introduced her
to Charles Babbage in June 1833. Babbage was an English mathematician, who first
had the idea for a programmable computer.
In 1842 and 1843, Ada translated the work of an Italian mathematician, Luigi
Menabrea, on Babbage's Analytical Engine. Though mechanical, this machine was an
important step in the history of computers; it was the design of a mechanical general-
purpose computer. Babbage worked on it for many years until his death in 1871.
However, because of financial, political, and legal issues, the engine was never built.
The design of the machine was very modern; it anticipated the first completed
general-purpose computers by about 100 years.
When Ada translated the article, she added a set of notes which specified in complete detail a method for calculating certain numbers with the Analytical Engine; this method has since been recognized by historians as the world's first computer program. She also saw possibilities in it that Babbage hadn't: she realised that the
machine could compose pieces of music. The computer programming language 'Ada',
used in some aviation and military programs, is named after her.
Atom-sized transistor created by scientists
By David Derbyshire, Science Correspondent
Scientists have created two transistors, each the size of a single atom, heralding the day of microscopic electronic devices that will revolutionise computing, engineering and medicine.
Researchers at Cornell University, New York, and Harvard University, Boston,
fashioned the two "nano-transistors" from purpose-made molecules. When voltage
was applied, electrons flowed through a single atom in each molecule.
The ability to use individual atoms as components of electronic circuits marks a key breakthrough in nano-technology, the creation of machines at the smallest possible scale.
Prof Paul McEuen, a physicist at Cornell, who reports the breakthrough in
today's issue of Nature, said the single-atom transistor did not have all the functions
of a conventional transistor such as the ability to amplify.
But it had potential use as a chemical sensor, responding to any change in its environment.
Basic principles of information security
Key concepts. For over twenty years, confidentiality, integrity and availability (known as the CIA triad) have been held to be the core principles of information security. There is continuous debate about extending this classic trio. Other principles such as accountability have sometimes been proposed for addition. It has been pointed out that issues such as non-repudiation do not fit well within the three core concepts, and as regulation of computer systems has increased (particularly amongst the Western nations), legality is becoming a key consideration for practical security installations. In 1992, and revised in 2002, the OECD's Guidelines for the Security of Information Systems and Networks proposed the nine generally accepted principles: Awareness, Responsibility, Response, Ethics, Democracy, Risk Assessment, Security Design and Implementation, Security Management, and Reassessment. Building upon those, in 2004 the NIST's Engineering Principles for Information Technology Security proposed 33 principles. From each of these, guidelines and practices are derived. In 2002, Donn Parker proposed an alternative model for the classic CIA triad that he called the six atomic elements of information. The elements are confidentiality, possession, integrity, authenticity, availability, and utility.
Confidentiality. Confidentiality is the property of preventing the disclosure of
information to unauthorized individuals or systems. For example, a credit card
transaction on the Internet requires the credit card number to be transmitted from the
buyer to the merchant and from the merchant to a transaction processing network.
The system attempts to enforce confidentiality by encrypting the card number during
transmission, by limiting the places where it might appear (in databases, log files, backups, printed receipts, and so on), and by restricting access to the places where
it is stored. If an unauthorized party obtains the card number in any way, a breach of
confidentiality has occurred. Breaches of confidentiality take many forms. Permitting
someone to look over your shoulder at your computer screen while you have
confidential data displayed on it could be a breach of confidentiality. If a laptop
computer containing sensitive information about a company's employees is stolen or
sold, it could result in a breach of confidentiality. Giving out confidential
information over the telephone is a breach of confidentiality if the caller is not
authorized to have the information. Confidentiality is necessary (but not sufficient)
for maintaining the privacy of the people whose personal information a system holds.
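One way to limit where the card number "appears", as described above, is to store only a masked form in logs, backups, and receipts. The helper below is a hypothetical illustration, not a mechanism named in the text.

```python
# Illustrative sketch: mask a card number before it reaches a log file
# or printed receipt, keeping only the last four digits.
def mask_card(card_number: str) -> str:
    """Return the card number with all but the last four digits hidden."""
    digits = card_number.replace(" ", "").replace("-", "")
    return "*" * (len(digits) - 4) + digits[-4:]

print(mask_card("4111 1111 1111 1234"))  # ************1234
```

Masking complements, rather than replaces, encrypting the number in transit and restricting access to wherever the full number must be stored.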
Integrity. In information security, integrity means that data cannot be modified
undetectably. This is not the same thing as referential integrity in databases,
although it can be viewed as a special case of Consistency as understood in the
classic ACID model of transaction processing. Integrity is violated when a message is
actively modified in transit. Information security systems typically provide message
integrity in addition to data confidentiality.
Availability. For any information system
to serve its purpose, the information must be available when it is needed. This means
that the computing systems used to store and process the information, the security
controls used to protect it, and the communication channels used to access it must be
functioning correctly. High availability systems aim to remain available at all times,
preventing service disruptions due to power outages, hardware failures, and system
upgrades. Ensuring availability also involves preventing denial-of-service attacks.
Authenticity. In computing, e-business and information security it is necessary to ensure that the data, transactions, communications or documents (electronic or physical) are genuine. It is also important for authenticity to validate that both parties involved are who they claim to be.
Non-repudiation. In law, non-repudiation
implies one's intention to fulfill their obligations to a contract. It also implies that one
party of a transaction cannot deny having received a transaction nor can the other
party deny having sent a transaction. Electronic commerce uses technology such as digital signatures and public key encryption to establish authenticity and non-repudiation.
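The message integrity and authenticity guarantees discussed above are commonly provided with a keyed hash (HMAC). The sketch below uses Python's standard library; the key and messages are made up for illustration.

```python
# Minimal sketch of message integrity with a keyed hash (HMAC).
import hashlib
import hmac

key = b"shared-secret-key"          # hypothetical pre-shared key
message = b"transfer $100 to account 42"

# Sender computes a tag over the message.
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(key: bytes, message: bytes, tag: str) -> bool:
    """Recompute the tag; a mismatch means the message was modified."""
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

print(verify(key, message, tag))                           # True: unmodified
print(verify(key, b"transfer $9999 to account 13", tag))   # False: tampered
```

Note that HMAC gives integrity and authenticity between the two key holders, but not non-repudiation: either party could have produced the tag, which is why digital signatures are used where one's intent must be provable to a third party.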
Risk management is the process of identifying vulnerabilities and threats to
the information resources used by an organization in achieving business objectives,
and deciding what countermeasures, if any, to take in reducing risk to an acceptable
level, based on the value of the information resource to the organization.
There are two things in this definition that may need some clarification. First,
the process of risk management is an ongoing iterative process. It must be repeated indefinitely. The business environment is constantly changing and new threats and vulnerabilities emerge every day. Second, the choice of countermeasures (controls)
used to manage risks must strike a balance between productivity, cost, effectiveness
of the countermeasure, and the value of the informational asset being protected. Risk
is the likelihood that something bad will happen that causes harm to an informational
asset (or the loss of the asset). A vulnerability is a weakness that could be used to
endanger or cause harm to an informational asset. A threat is anything (man-made or
act of nature) that has the potential to cause harm.
The likelihood that a threat will use a vulnerability to cause harm creates a risk.
When a threat does use a vulnerability to inflict harm, it has an impact. In the context
of information security, the impact is a loss of availability, integrity, and
confidentiality, and possibly other losses (lost income, loss of life, loss of real
property). It should be pointed out that it is not possible to identify all risks, nor is it
possible to eliminate all risk. The remaining risk is called residual risk.
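Where risk is treated quantitatively, one common formulation (not named in the text itself) is annualized loss expectancy: the expected loss from a single incident times how often the incident is expected per year. The figures below are hypothetical.

```python
# Sketch of one common quantitative risk measure (annualized loss expectancy).
def single_loss_expectancy(asset_value: float, exposure_factor: float) -> float:
    """Expected loss from one occurrence: asset value times fraction lost."""
    return asset_value * exposure_factor

def annualized_loss_expectancy(sle: float, annual_rate: float) -> float:
    """Expected yearly loss: SLE times the annualized rate of occurrence."""
    return sle * annual_rate

# Hypothetical numbers: a $200,000 server, 25% damaged per incident,
# and an incident expected once every two years (rate 0.5/year).
sle = single_loss_expectancy(200_000, 0.25)   # 50000.0
ale = annualized_loss_expectancy(sle, 0.5)    # 25000.0
print(sle, ale)
```

A yearly expected loss like this gives a ceiling on how much it is rational to spend on countermeasures for that particular risk.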
A risk assessment is carried out by a team of people who have knowledge of
specific areas of the business. Membership of the team may vary over time as
different parts of the business are assessed. The assessment may use a subjective
qualitative analysis based on informed opinion, or, where reliable dollar figures and historical information are available, a quantitative analysis.
Research has shown that the most vulnerable point in most information systems is the human user, operator, or designer. The practice of information security
management recommends the following to be examined during a risk assessment:
security policy;
organization of information security;
asset management;
human resources security;
physical and environmental security;
communications and operations management;
information systems acquisition, development and maintenance;
information security incident management;
business continuity management;
regulatory compliance.
In broad terms, the risk management process consists of:
1. Identify assets and estimate their value. Include: people, buildings, hardware, software, data (electronic, print, other), supplies.
2. Conduct a threat assessment. Include: acts of nature, acts of war, accidents,
malicious acts originating from inside or outside the organization.
3. Conduct a vulnerability assessment, and for each vulnerability, calculate the
probability that it will be exploited. Evaluate policies, procedures, standards, training,
physical security, quality control, technical security.
4. Calculate the impact that each threat would have on each asset. Use
qualitative analysis or quantitative analysis.
5. Identify, select and implement appropriate controls. Provide a proportional
response. Consider productivity, cost effectiveness, and value of the asset.
6. Evaluate the effectiveness of the control measures. Ensure the controls
provide the required cost-effective protection without discernible loss of productivity.
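Steps 4 and 5 above (calculate the impact of each threat on each asset, then provide a proportional response) can be sketched as a toy calculation. Every asset, threat, probability, and cost below is invented for illustration.

```python
# Toy sketch of steps 4-5: expected impact per (threat, asset) pair,
# then keep only controls whose cost is below the loss they address.
assets = {"customer_db": 100_000, "web_server": 20_000}

# threat -> (probability of exploitation per year, fraction of asset lost)
threats = {"sql_injection": (0.3, 0.8), "power_outage": (0.1, 0.2)}

def expected_impact(asset_value: float, probability: float,
                    loss_fraction: float) -> float:
    return asset_value * probability * loss_fraction

impacts = {
    (threat, asset): expected_impact(value, prob, frac)
    for threat, (prob, frac) in threats.items()
    for asset, value in assets.items()
}

# Proportional response: adopt a control only if it costs less than the
# expected impact it mitigates.
controls = {("sql_injection", "customer_db"): 5_000,
            ("power_outage", "web_server"): 1_000}
selected = [pair for pair, cost in controls.items() if cost < impacts[pair]]
print(selected)
```

Here the SQL-injection control is selected (cost 5,000 against an expected impact of 24,000), while the power-outage control is not (cost 1,000 against an expected impact of 400), illustrating the balance between cost and value that the definition of risk management calls for.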
For any given risk, Executive Management can choose to accept the risk based upon the relatively low value of the asset, the relatively low frequency of occurrence, and the relatively low impact on the business. Or, leadership may choose to mitigate the
risk by selecting and implementing appropriate control measures to reduce the risk. In
some cases, the risk can be transferred to another business by buying insurance or outsourcing. The reality of some risks may be disputed. In such
cases leadership may choose to deny the risk. This is itself a potential risk.
When Management chooses to mitigate a risk, they will do so by implementing
one or more of three different types of controls.
Administrative. Administrative controls (also called procedural controls)
consist of approved written policies, procedures, standards and guidelines.
Administrative controls form the framework for running the business and managing
people. They inform people on how the business is to be run and how day to day
operations are to be conducted. Laws and regulations created by government bodies
are also a type of administrative control because they inform the business. Some
industry sectors have policies, procedures, standards and guidelines that must be
followed – the Payment Card Industry (PCI) Data Security Standard required by Visa
and Master Card is such an example. Other examples of administrative controls
include the corporate security policy, password policy, hiring policies, and
disciplinary policies. Administrative controls form the basis for the selection and
implementation of logical and physical controls. Logical and physical controls are
manifestations of administrative controls. Administrative controls are of paramount importance.
Logical. Logical controls (also called technical controls) use software and data
to monitor and control access to information and computing systems. For example:
passwords, network- and host-based firewalls, network intrusion detection systems,
access control lists, and data encryption are logical controls. An important logical
control that is frequently overlooked is the principle of least privilege. The principle
of least privilege requires that an individual, program or system process is not granted
any more access privileges than are necessary to perform the task. A blatant example
of the failure to adhere to the principle of least privilege is logging into Windows as
user Administrator to read e-mail and surf the Web. Violations of this principle can
also occur when an individual collects additional access privileges over time. This
happens when employees' job duties change, or they are promoted to a new position,
or they transfer to another department. The access privileges required by their new
duties are frequently added onto their already existing access privileges which may
no longer be necessary or appropriate.
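The principle of least privilege described above can be sketched as a minimal access-control list, where each role is granted only the permissions its task requires. The roles and permissions below are made up for illustration.

```python
# Sketch of least privilege via a simple access-control list:
# each role maps to the minimal set of permissions it needs.
ACL = {
    "mail_reader": {"read_mail"},
    "web_user": {"browse_web"},
    "administrator": {"read_mail", "browse_web", "install_software"},
}

def allowed(role: str, action: str) -> bool:
    """Grant an action only if the role was explicitly given it."""
    return action in ACL.get(role, set())

# Reading mail does not require logging in as administrator:
print(allowed("mail_reader", "read_mail"))         # True
print(allowed("mail_reader", "install_software"))  # False
```

The Windows example in the text is exactly this distinction: day-to-day mail and web use should run under a role like `mail_reader` or `web_user`, never under `administrator`.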
Physical. Physical controls monitor and control the environment of the work
place and computing facilities. They also monitor and control access to and from
such facilities. For example: doors, locks, heating and air conditioning, smoke and
fire alarms, fire suppression systems, cameras, barricades, fencing, security guards,
cable locks, etc. Separating the network and work place into functional areas is also a physical control.
An important physical control that is frequently overlooked is the separation of
duties. Separation of duties ensures that an individual cannot complete a critical task
by himself. For example: an employee who submits a request for reimbursement
should not also be able to authorize payment or print the check. An applications
programmer should not also be the server administrator or the database administrator
– these roles and responsibilities must be separated from one another.
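The reimbursement example above can be sketched as a check that enforces separation of duties: the submitter of a request must never be its approver. The function and names are hypothetical.

```python
# Sketch of separation of duties: the same person cannot both submit
# and approve a reimbursement request.
def approve_payment(submitted_by: str, approver: str) -> str:
    if submitted_by == approver:
        raise PermissionError(
            "separation of duties: submitter cannot approve their own request")
    return f"approved by {approver}"

print(approve_payment("alice", "bob"))  # approved by bob
try:
    approve_payment("alice", "alice")
except PermissionError as err:
    print(err)
```

The same pattern applies to the programmer/administrator example: role assignments, not just individual transactions, should be checked so that no one person holds both sides of a critical task.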