
Linux Foundation Security Checklist: Have It Your Way

Sysadmins should use the LF security checklist as a resource, said its creator, Konstantin Ryabitsev. “They can evaluate it, adapt it, hack on it until it fits their purpose, and hopefully contribute back via patches or feedback so others can, in turn, benefit from their work. That is one of the wonderful things about open source. If you create something, you share it and see where it goes.”

The Linux Foundation’s recently published security checklist may draw more attention to best practices for protecting Linux workstations, even if IT pros do not embrace all of its recommendations.

Konstantin Ryabitsev, the foundation’s director of collaborative IT services, developed the list for LF remote sysadmins to use in hardening their laptops against attacks. However, the foundation has not asked for universal adoption.

The document covers a variety of situations, and it includes explanations about why certain measures are necessary and how best to implement them.

“Checklists and best practices documents are how Linux Foundation IT works internally. We are just taking an extra step of making generalized versions of these documents available to others under free documentation licenses, in hopes that they are useful to other teams. We have been doing this for months as part of our regular work,” Ryabitsev told LinuxInsider.

Critical Protections

The security checklist strikes a balance between security decisions and usability issues, according to Ryabitsev. It categorizes security according to four severity levels: critical, moderate, low and paranoid.

Critical recommendations consist of implementations that should be mandated: for instance, enabling SecureBoot to prevent rootkits or evil maid attacks, and choosing a Linux distribution that supports native full disk encryption.

Other factors deemed critical are using Linux products with timely security updates and cryptographic verification of packages. Also on the critical list is support for mandatory access control or role-based access control mechanisms such as SELinux, AppArmor or Grsecurity.

More key critical guidelines include encrypting the swap partition, requiring a password to edit the bootloader, setting up a robust root password, and using an unprivileged account with a separate password for regular operations.

Further, using a password manager, choosing unique passwords for different websites, and protecting private keys with strong passphrases are considered critical.
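The password guidance above — unique, strong passwords for every site, held in a manager — is easy to illustrate. The sketch below is not a Linux Foundation tool, just a minimal example built on Python’s standard `secrets` module, which draws from the operating system’s cryptographically secure random number generator:

```python
import secrets
import string

def generate_password(length: int = 24) -> str:
    """Generate a random password from letters, digits and punctuation.

    secrets.choice draws from the OS CSPRNG, so every call yields an
    independent, unpredictable password -- the property the checklist
    asks for: no reuse across sites.
    """
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

# Two calls produce two unrelated passwords.
pw_a = generate_password()
pw_b = generate_password()
print(len(pw_a), pw_a != pw_b)  # → 24 True
```

A real password manager layers encrypted storage and per-site lookup on top of generation like this; the point here is only that strong, unique credentials are cheap to produce.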

From Moderate to Paranoid

The moderate and low severity guidelines offer substantial security value. Among them are running automatic operating system updates, disabling the SSH server on the workstation, storing authentication, signing and encryption keys on smartcard devices, and putting PGP master keys on removable storage.

There are moderate and low severity guidelines for Web-surfing software, the Linux Foundation’s Ryabitsev noted.

For example, using two separate browsers is far from frivolous, he said, recommending Mozilla Firefox and Google Chrome.

The security angle focuses on which add-ons or extensions are paired with the browsers. For example, Firefox should have NoScript, Privacy Badger, HTTPS Everywhere and Certificate Patrol add-ons for work-related sites. Google Chrome should have Privacy Badger and HTTPS Everywhere installed.

The Linux Foundation’s recommendations labeled “paranoid” are for IT workers interested in implementing the ultimate security steps.

Guidelines for the paranoid IT worker include measures that have the potential for significant extra security benefits but that might take considerable effort to implement or understand. Two such items are running an intrusion-detection system, and using separate password managers for websites and other types of accounts.

Defining Terms

The security checklist from the Linux Foundation is a shining example of a measured security guideline, according to Patrick Morgan, senior software engineer at CabForward.com.

“The vast majority of the list details best practice and pragmatically addresses the security pitfalls of modern desktop computing. Few if any of the concepts and software technologies mentioned should be new to the intended audience,” he told LinuxInsider.

IT works in an age of economically motivated computer attacks and politically driven pervasive monitoring and compromised networks. That makes system administrators primary targets, Morgan said.

Balancing Act

A number of security problems have afflicted Linux systems recently, Ryabitsev noted.

“Systems administrators should approach this document just like they do all other open source resources,” he said.

“They can evaluate it, adapt it, hack on it until it fits their purpose, and hopefully contribute back via patches or feedback so others can, in turn, benefit from their work. That is one of the wonderful things about open source. If you create something, you share it and see where it goes,” Ryabitsev added.

“People are engaging with the document and sharing their feedback,” he pointed out. “We do believe that with many eyes all bugs are shallow, and the more people engage and learn from one another around security best practices, the better.”

Mission-Critical Machines

Sysadmins’ computers offer a gold mine of opportunities for hackers. Gaining access to emails, text files and notes, contact lists, encryption keys, and ephemeral browser sessions allows them to abuse end users, clients and employers, Morgan noted.

“That makes your machine more valuable than any other single target. It should be protected as such,” he said.

IT workers bear more responsibility than the average computer user. So if IT pros are not following these guidelines at a minimum, they are doing everyone they support — and the Internet in general — a disservice.

“Sloth and ignorance are not valid excuses,” said Morgan.

Reasonable Steps

Sysadmins definitely should put the checklist recommendations into full play, urged Tom O’Connor, lead product engineer for Linux solutions at Raytheon|Websense.

“The checklist admits to not being exhaustive and open to adaptation and tailoring,” he told LinuxInsider.

For an admin just starting a security project focusing on mobile Linux workstations, this checklist would be a great baseline. For an existing enterprise with mobile Linux assets and existing security practices and policies in place, it would be a good measuring stick for finding and addressing any gaps in coverage, O’Connor noted.

“Should this checklist be the only tool used for securing mobile Linux users? No. The Linux Foundation does not claim this checklist to be exhaustive. I found the checklist to be a perfectly reasonable set of steps to take in securing Linux mobile workstation environments,” he said.

Healthy Paranoia

In today’s hostile environment, IT workers cannot be too paranoid about security issues, suggested Rob Kraus, director of security research and strategy at Solutionary.

“Linux or not, deep understanding of security is not the core competency of most system administrators I have encountered over the years. This is part of the reason we have focused on people who fill the security roles within organizations of all sizes,” he told LinuxInsider.

The primary rule of security is for IT to be paranoid, he said. “If you are not, then what are you doing in security, and how effective are you really being at protecting your organization today? In short, always leverage the tools that can make you successful. Checklists are nothing new and probably not used as much as they could be.”


IBM to Buy Watson a Pair of Eyes

Watson, IBM’s three-million-dollar baby, is about to get some peepers.

IBM last week announced a US$1 billion deal to acquire Merge Healthcare, a provider of enterprise imaging and clinical systems, with the goal of giving the supercomputer “eyes.”

Merge, which has operated in the health sector for two and a half decades, maintains a medical imaging management platform for archiving, accessing and sharing medical imagery.

The incorporation of Merge into Watson Health will give medical professionals the ability to leverage Watson for assistance with analyzing X-rays, MRIs, angiograms, electrocardiograms, and other medical images to spot anomalies and support their diagnoses, an IBM spokesperson said in a statement provided to TechNewsWorld by IBM Watson Health’s Christine Douglass.

“Increasingly, the universe of data in our daily lives is not just text — but many other forms like video, photos, voice and music — and medical imaging plays a critical role in diagnosis, treatment and medical monitoring,” the spokesperson said. “IBM has continued to push the boundaries of what Watson can do in healthcare, and vision is the next key area that we are pursuing.”

There are compelling synergies in the marriage of Merge’s medical imaging and IBM’s notable resources in analytics, cloud and cognitive computing, noted Kathleen Maher, vice president at Jon Peddie Research.

“There have been huge advances in medical technology,” she told TechNewsWorld, “but the providers tend to be segmented, as evidenced by the separation between radiologists and their peers in healthcare services. With the acquisition of Merge, IBM gains access to a huge trove of imaging data that can help Watson become smarter about imaging data.”

Speak Now or Forever Hold Your Peace

Merge has been a major player in the healthcare sector, but $1 billion is still a hefty price tag, noted Maher, adding that a review of the company’s finances indicates it has had difficulty in growing.

“A big challenge for the medical industry is the sluggish rate at which standards are developed and mandated by government agencies and industry,” she explained. “The length of time it is taking for medical data to be digitized, shared, standardized, etc., is challenging companies who want to move forward in this field.”

IBM’s cloud business revenues have been soaring, as both it and Watson evolve. Where small companies may struggle, IBM has been a force multiplier of sorts with its deep resources, observed Maher.

“IBM is a company that plays a long game,” she said. “It has demonstrated its commitment with a big check for Merge and also its active recruitment of industry partners.”

Becoming Human

IBM has been working for more than a decade on cognitive systems for understanding the complexities of images and videos, such as object identification and other things most people take for granted.

Watson became famous in 2011 when it took on — and beat — Jeopardy! champs Brad Rutter and Ken Jennings.

IBM since then has built an entire unit around Watson, and this spring began offering the might of the cloud-connected cognitive computer to the healthcare industry through the Watson Health service and the Watson Health Cloud platform.

“As the transformation of global healthcare continues,” IBM’s spokesperson said, “IBM Watson Health is focused on bringing systems of care, wellness, and support together with technology, data, and healthcare expertise to help people live healthier and more productive lives.”


Apple Mac attacks ‘trivial’, claims security researcher

Creating malicious software that can attack Apple Mac computers is “trivial”, a leading security researcher has claimed.

Patrick Wardle, from security firm Synack, demonstrated several new types of malicious software that bypassed Apple’s security measures.

In one example, Apple’s own iCloud service could control an attack.

The threats are known to Apple, Mr Wardle said, but the company has not yet commented on the research.

Mr Wardle was speaking at Black Hat 2015, an annual gathering of hackers and security professionals held in Las Vegas.

He commended the company’s efforts in working with him to make the platform more secure, saying that the Cupertino-based firm “got security”.

But he argued that Apple’s increased popularity means it is attracting extra attention from cybercriminals who would commonly focus on attacking computers running Microsoft’s Windows.

While Windows is still overwhelmingly attackers’ platform of choice, antivirus firm Kaspersky Lab recorded a surge in Apple malware in the past couple of years.

Bypassing

The past year has seen several high-profile malware – malicious software – attacks on the Mac operating system, OS X.

Among them, iWorm and WireLurker – the latter gaining a lot of media attention. However, Mr Wardle described such threats as “grade C+” due to a simple flaw: users could see if the malware was running, and disable it.

The attacks he detailed were far more hidden than anything that had been discovered so far.

“I’m convinced that OS X security is lacking,” he told delegates.


“It’s trivial to write new OS X malware that can bypass everything.”

Addressing why he was making the vulnerabilities public, he said: “If I can do it, nation states and adversaries can and probably are doing it.”

Some elements of the vulnerabilities he and other researchers have discovered have been found “in the wild”, he said, the term given to threats being exploited on real users.

Trusted files

Mr Wardle’s research focused considerably on one piece of Apple software known as Gatekeeper. This is a program which warns the user when they are opening a file that is not from a “trusted” source.


Its default setting is to only allow programs downloaded from Apple’s App Store and trusted third-party developers.

This means, according to Apple’s website: “If an app was developed by an unknown developer — one with no Developer ID — Gatekeeper can keep your Mac safe by blocking the app from being installed.”

But he demonstrated a method of circumventing this protection, using “dynamic libraries” to inject malicious code into trusted programs.

iCloud malware

Mr Wardle had strong criticisms of Apple’s built-in antivirus program, XProtect. The software, which detects and blocks known malware, warning the user in the process, could be tricked by essentially renaming the malware.

The researcher also tested various paid antivirus products on the market, and concluded that they suffer from the same problems as XProtect.
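To see why renaming defeats a naive check, note that a file’s name and its contents are independent: a detector keyed to names is fooled by a rename, while a hash of the contents is unchanged. The hypothetical Python sketch below (illustrative only, and not how XProtect actually works internally) makes the point concrete:

```python
import hashlib
import os
import tempfile

def sha256_of(path: str) -> str:
    """Return the SHA-256 digest of a file's contents, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Write a harmless stand-in "malware" payload, then rename the file.
with tempfile.TemporaryDirectory() as d:
    original = os.path.join(d, "evil.bin")
    renamed = os.path.join(d, "innocent.bin")
    with open(original, "wb") as f:
        f.write(b"\xca\xfe stand-in payload bytes")
    before = sha256_of(original)
    os.rename(original, renamed)
    after = sha256_of(renamed)
    # A name-based check is fooled by the rename; a content hash is not.
    print(before == after)  # → True
```

Real antivirus engines combine content signatures, heuristics and behavioral monitoring; the sketch only shows why a purely name-keyed signature is the weakest of these.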

In one case, he noted that some antivirus programs consider Apple’s iCloud system – the online storage service it offers all users of its products – to be a “trusted” source.

This means Mr Wardle was able to use iCloud to host an attack’s command-and-control server, the part of an attack that directs the malware’s operation. That implicit trust of iCloud servers is a problem.

“Normally malware (on a user’s computer) would have its outgoing network connections blocked since they are untrusted,” explained Mr Wardle to the BBC.

“But if they go to iCloud, the security products let them out.”

Apple love

Mr Wardle noted that Apple has been receptive to his research in the past, but that the platform remained vulnerable to the methods he described.

He has created free software – called Objective-See – to address the issues he outlined.


A request for comment from the BBC to Apple had gone unanswered at the time of publication.

Mr Wardle said: “I’ve shared this with Apple, and they have patched or fixed some of the bugs.

“The problem is, in some cases their patches are insufficient, so I can bypass the patch.

“I always (first) share my research with Apple and only disclose details once they have released a patch.”

Concluding his talk, Mr Wardle said his work was motivated by a love for Apple and its products.

“I don’t think they love me,” he added. “But I can handle that.”


What is Moore’s Law?

If you’ve been around the internet for longer than Jayden Smith, you’re probably familiar with Moore’s Law. It’s often misquoted, often misunderstood, but its “law” status is rarely questioned. The most general possible way to state Moore’s Law is this: computing power tends to approximately double every two years. It gained fame because people like laws that let them predict the future of one of the world’s biggest industries, but the very physical basis for this principle makes it slightly different — and less reliable — than many people believe.

Though he did not give it that name, Moore’s Law was first proposed in a magazine article by Intel co-founder Gordon E. Moore. What it actually says is that the number of transistors that can be packed into a given unit of space will roughly double every two years. That prediction has remained impressively true, a fact that has enabled everything from pocket-sized smartphones to Crysis 3, along with the continuing computerization of the economy.
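Numerically, the prediction is just repeated doubling. Starting from the roughly 2,300 transistors of Intel’s 4004 in 1971 (a well-known data point, not a figure from this article), a doubling every two years projects about 2.4 billion transistors by 2011 — in line with real high-end chips of that era. A quick sketch:

```python
def projected_transistors(start_count: int, start_year: int, year: int,
                          doubling_period: float = 2.0) -> float:
    """Project a transistor count under Moore's Law-style doubling.

    Every `doubling_period` years, the count multiplies by two.
    """
    doublings = (year - start_year) / doubling_period
    return start_count * 2 ** doublings

# Intel 4004 (1971): ~2,300 transistors.
for year in (1971, 1991, 2011):
    print(year, f"{projected_transistors(2300, 1971, year):,.0f}")
# → 1971 2,300
# → 1991 2,355,200
# → 2011 2,411,724,800
```

Stretching the doubling period to five years, as the article contemplates below, would leave the 2011 projection at barely a quarter-million transistors — which is why the period matters so much.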


Yet, stated as a prediction about human abilities in physical manufacturing, and divorced from rather airy ideas like “computing power,” it becomes clear why Moore’s Law won’t necessarily always hold true. Remember that when Moore made his original prediction, he predicted a doubling every year, but he quickly amended this to every two years. Physical limitations on the manufacturing of these chips could easily push that number back to five years or more, effectively invalidating Moore’s Law forever, and revealing it to be nothing more than Moore’s Very Good But Ultimately Limited Prediction (MVGBULP).


Today, all consumer processors are made of silicon — the second most abundant element in the Earth’s crust, after oxygen. But silicon is not a perfect conductor, and limits to the mobility of the electrons it carries impose a hard limit on how densely you can pack silicon transistors. Not only does power consumption become a huge issue, but an effect called quantum tunneling can make it hard to keep electrons contained once features shrink below a certain thickness threshold.

Outside of research facilities, silicon transistors don’t currently get smaller than 14 nanometers — and while some 10-nanometer chip designs might someday reach the market, it’s seen as a foregone conclusion that to keep to Moore’s Law over a long period of time, we’ll have to come up with newer and better materials to be the basis of next-generation computers.

One oft-cited example is graphene, or the rolled-up tubes of graphene called carbon nanotubes. Graphene is “atomically thin,” often called two-dimensional, and so it allows a huge increase in transistor density on the physical side of things. On the other hand, graphene does not have a useful bandgap — the energy difference we need to navigate to bump electrons back and forth between the conducting and non-conducting bands. That’s how silicon transistors switch on and off, which is the entire basis for their method of computation.

If this problem can’t be offset in some way, a graphene computer would have to pioneer a whole new logical method for computing. One graphene computer chip from IBM proved to be incredibly fast, 10,000 times faster than a silicon chip — but it was not a general-purpose processor. Since graphene can’t be easily switched on and off in mass quantities, we can’t simply swap in graphene for silicon and keep on with modern chip architectures.


Other materials may offer more practical reductions in size and electrical resistance, and actually allow Moore’s Law to continue unbroken, but only if they hit the market quickly enough. Silicon-germanium, or just germanium alone, has been talked about for some time, but has yet to really materialize in any sort of affordable form. It was recently discovered that a material called titanium trisulfide can provide many of the same physical advantages as graphene, and do so with an achievable bandgap — such a super-material might be what’s needed, but graphene-like problems with manufacturing then rear their ugly heads.

Quantum computing could be another answer, but research is still so preliminary that it’s doubtful. Some believe quantum computers will offer such a huge and immediate upgrade over modern processors that computer encryption will come tumbling down. However, quantum computing won’t necessarily come in the form of a programmable digital computer right away; early quantum computers won’t be able to run Windows, even if they are more than fast enough in a theoretical sense. Of all the possible “solutions” to looming problems with Moore’s Law, quantum computing is probably the least realistic. It has a lot of potential for specific applications, but quantum PCs are still too far out to be worth considering.

Moore himself admitted that his Law “can’t continue forever” in a 2005 interview. It’s the nature of exponential functions, he said — they eventually hit a wall, and while that makes perfect sense in the purely hypothetical world of mathematics, it tends not to work out as well in the real world. It could be that Moore’s Law will hold up when viewed on the century scale, zoomed out to diminish the importance of any small fluctuations between new technologies. But the fact remains that right now, we’re entering a lull as we wait for the next great processing tech to arrive.


Rare Alan Turing journal shows his genius at work

CNET gets a deeper look at the hidden notebook from the “father of modern computing,” before it goes up for auction.

In a beige, nondescript building on the edge of San Francisco’s industrial district, rare-books expert Cassandra Hatton opens a black, book-shaped case with gold lettering etched on its cover and spine. Inside is an ordinary-looking journal containing the mathematical musings of a man with extraordinary talents: mathematician and war hero Alan Turing.

Turing’s influence permeates the computer industry. He was the first to devise the notion of step-by-step instructions, or algorithms, for performing calculations. His so-called Turing machine concept became the basis of the digital computer. Now the public is getting a glimpse of his mathematical brilliance at work.
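Turing’s machine concept is simple enough to sketch in a few lines of code: a tape of symbols, a read/write head, and a table of step-by-step rules — exactly the “algorithm” idea described above. The toy machine below, an inverter that flips 0s and 1s, is purely illustrative and obviously not drawn from Turing’s notebook:

```python
def run_turing_machine(tape, rules, state="start", accept="halt", max_steps=1000):
    """Run a one-tape Turing machine.

    `rules` maps (state, symbol) -> (symbol_to_write, move, next_state),
    where move is -1 (left), 0 (stay) or +1 (right). The tape is stored
    as a dict so it is unbounded; blank cells read as "_".
    """
    cells = dict(enumerate(tape))
    head = 0
    for _ in range(max_steps):
        if state == accept:
            break
        symbol = cells.get(head, "_")
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += move
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# A toy rule table: invert a binary string, then halt at the first blank.
invert = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", "_"): ("_", 0, "halt"),
}
print(run_turing_machine("1011", invert))  # → 0100
```

Despite its simplicity, this rule-table model is computationally universal: with the right table, such a machine can in principle compute anything a modern processor can — which is why it became the basis of the digital computer.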

“[Turing’s work] has an impact on every single person, at least in the Western world,” said Hatton, a senior specialist in Fine Books & Manuscripts and Space History at auction house Bonhams. “The way the world functions now would not exist without him.”

Hatton has handled the first editions of Galileo’s “Dialogo” and Newton’s “Principia” as well as manuscripts written by Albert Einstein and Richard Feynman. Yet she counts Turing’s journal, surfacing publicly for the first time since his death 61 years ago, as one of the top scientific documents she’s held. That’s because it’s the most extensive example of Turing’s handwritten notations — covering 56 pages — ever seen. Bonhams, which will put the document up for auction in April, expects it to fetch “at least seven figures,” a portion of which will go to charity.

“He was not known for handwriting things,” said Hatton. “It was either in his head or he would type it out. In all likelihood, this is the only manuscript like this.”

Code breaking by day, math by night

Hatton removed the multicolored notebook from its case, placed it on a table and then opened it to show the pages filled with line after line of math theory and equations, in Turing’s precise handwriting.

She’s concluded that he wrote his notes sometime between 1940 and 1942, while Turing was playing a pivotal role at Bletchley Park cracking intercepted coded messages from the Nazis.

Essentially, he was cracking codes during the day and working on mathematics at night.

Hatton has spent a year researching the different mathematical theories Turing was working on. She compared Turing’s math work to finding a mistake in Sudoku, the Japanese number-placement puzzle in which the player fills a grid so that every row, column and box contains each digit exactly once.

“You can’t guess. If you make a mistake, you’re 20 moves along and you don’t know where you went wrong,” she said. “He’s looking at how his predecessors are looking at mathematical notations…he’s trying to see where they went wrong so he can make it right.”

The manuscript is all business, but there are some moments when his personality comes through. “Hateful!” reads a tiny note over an equation.

Turing wrote about a mathematical theory on one side of the notebook; then flipped the book over to write from the other end about mathematical notations. In the middle of the notebook is a dream journal written by Robin Gandy, Turing’s colleague and heir to all of Turing’s papers. Unlike Turing’s other papers, which Gandy archived, the journal was locked away and remained undiscovered until Gandy’s death in 1995.

Turing’s notes give a glimpse at the roots of computer science, according to Vijay Pande, a Stanford University professor of computer science who examined a few pages.

“It’s clear that fundamental logic is at the heart of computer science and everything we do — and in that sense it’s clear the whole field owes Turing so very much,” he wrote in an email. “But in a sense, it also shows how far we’ve come.”

It’s hard to overstate Turing’s impact on our world. His concept of the Turing machine led to the development of digital computers. “I propose to consider the question, ‘Can machines think?'” he wrote in a paper in 1950. That proposal led to the Turing Test, in which a person tries to distinguish whether she’s interacting with another human or with a computer. The notion has become the basis of artificial intelligence, the technology underlying digital assistants like Siri, Cortana or Google Now.

“I think the fact that we all carry Turing machines around in our pockets and still argue about the Turing test 70-plus years later says it all,” Marc Andreessen, partner of venture capital firm Andreessen Horowitz, said about Turing’s influence.

From rotors to hashtags

The Oscar-nominated “The Imitation Game,” based primarily on Turing’s work at Bletchley Park, is just finishing its run in theaters.

In it, actor Benedict Cumberbatch portrays Turing standing in front of a giant machine made of wires and rotors clicking into position. It’s an exaggerated depiction of Turing’s Bombe machine, used to decode the Nazis’ Enigma machine messages during World War II. Historians estimate that the work of Turing and his team shortened the war by two to four years.

At the movie’s end, a caption tells us Turing’s invention would later evolve into modern-day computers.

“That was really cool, but how does that become what we use now?” my 24-year-old cousin asked after watching the movie last week. “Like, how do I hashtag with that?”

In truth, Turing first devised his concept for what would become a digital computer in the 1930s, long before the Bombe. The Bombe was designed to do only one thing: search for the possible settings needed to decrypt messages. Turing, by contrast, had conceived the idea of a machine that could store both data and instructions in memory.

“It’s no exaggeration to say he’s a founding father of every computer and Internet company today,” Google Engineering Director Andrew Eland wrote in 2012 in Google’s “tribute to Alan Turing, the father of modern computing.”

“While his wartime code breaking saved thousands of lives, his own life was destroyed when he was convicted for homosexuality. But the tragedy of his story should not overshadow his legacy,” Eland wrote.

In its own way, that tragedy has just as big an impact on our modern society as Turing’s scientific achievements, said Hatton. Though a brilliant mathematician and war hero, he was considered a criminal at a time when acts of homosexuality were outlawed. In 1952 he was convicted of gross indecency and chemically castrated as part of his sentence. As a result of the treatment, he committed suicide in 1954.

“It’s also a very concrete symbol of what the world has lost because of homosexual persecution,” Hatton said. “Who knows what else he might have done? Or other people who were persecuted? Who knows what else the world has lost?”
