Sunday, December 15, 2013

Scientific Computing: Scientific Research as a Precision Tool

The creation of a complex system that uses real-world data to solve problems often requires a high degree of precision, and that precision is rarely achievable without prior scientific research. As someone with experience working with artificial intelligence and assembly-line automation, I can confirm that prior scientific research is essential for analytical systems in a factory setting (especially those working with organic objects).

For example, if a food company wants to design a system for checking the quality of potato chips, data would need to be collected on the chips themselves before a solution could be developed. The first step would be to define what quality of product is acceptable for shipment. What are the properties of such a chip? Its shape, size, color, pH level, trans fat content, and so on? Once acceptable quality is defined, it is equally necessary to characterize what is not acceptable. These quality levels can be determined through scientific tests and data analysis that identify which properties make the product tasty, unhealthy, visually appetizing, etc. Using this data, thresholds can then be created that define exactly what makes a "good" chip and what makes a "bad" one. From there, it is possible to create algorithms that analyze each chip with a variety of sensors and dispose of the defective ones.
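The threshold idea above can be sketched in a few lines of code. Note that the property names and limits here are made-up examples for illustration, not real industry values:

```python
# Hypothetical quality thresholds, as if derived from research data.
# Each property maps to its acceptable (low, high) range.
THRESHOLDS = {
    "diameter_mm": (40, 70),        # acceptable size range
    "green_fraction": (0.0, 0.05),  # max fraction of green pixels in the image
    "trans_fat_g": (0.0, 0.2),      # per-chip trans fat limit
}

def is_good_chip(chip):
    """Return True only if every measured property falls inside its range."""
    return all(lo <= chip[prop] <= hi for prop, (lo, hi) in THRESHOLDS.items())

chips = [
    {"diameter_mm": 55, "green_fraction": 0.01, "trans_fat_g": 0.1},  # fine
    {"diameter_mm": 55, "green_fraction": 0.30, "trans_fat_g": 0.1},  # too green
]
print([is_good_chip(c) for c in chips])  # → [True, False]
```

A real system would compute these properties from sensor data (camera images, chemical assays) rather than receive them directly, but the decision step reduces to exactly this kind of threshold check.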

Green potato chips are sometimes considered undesirable by consumers
During my internship, I assisted my mentor in applying scientific research and analysis to flower bulbs (see previous post on A.I.). In organic materials, the properties of objects vary greatly depending on their genetic makeup. It is very important to prepare properly for this diversity when utilizing computers to solve problems!

Sunday, December 8, 2013

Computer Graphics: How a Teapot "Shaped" an Industry

In the 1970s, 3D computer-generated imagery was a new concept being actively researched. At that time, very few computers had enough power to generate 3D images, and even if they did, special hardware (e.g. a graphics terminal) was required to view the resulting render. During this time, the University of Utah was a big name in computer graphics research. One of the university's best-known researchers in the area was Martin Newell. In 1975, he needed a recognizable shape that could be used as a benchmark to test the 3D rendering system he was developing. One day, Newell was drinking tea with his wife and noticed that the teapot's shape had a very diverse range of mathematical properties, making it the perfect object to model his benchmark after.
Martin Newell's Teapot Sketch.

Modern Rendering


Newell then sketched the teapot onto graph paper and entered the coordinates into a computer. The result was an endearing image of a teapot that left a big impact on computer graphics. For decades after, the Utah Teapot continued to be used as a benchmark for graphics capabilities. These days, rendering the teapot into films and other works is an inside joke among graphics artists. The original teapot is on exhibit at the Computer History Museum in Mountain View. I was privileged enough to see it there recently!
The teapot makes a cameo in Toy Story
The Utah Teapot on display

Sunday, December 1, 2013

Communications and Security: Are Your Tools Working Against You?

Thanks to Edward Snowden, Julian Assange, the NSA and others, information security has become a critical concern for many people across the globe. Computer users everywhere are concerned that they may be under the surveillance of some kind of government entity. Many experts say that encrypting data and using strong passwords can help a lot, but is it enough? Today, I'd like to talk about the frightening possibility that a threat to your security may lurk in a place that is unreachable to most people: your compiler.

In 1984, Ken Thompson, the co-creator of Unix, gave a very interesting lecture on information security as part of his Turing Award acceptance. The speech, titled Reflections on Trusting Trust, gave an extremely alarming insight into just how difficult it is to keep data secure. In this talk, Thompson described how he was able to modify the source code of his C compiler to "deliberately miscompile source whenever a particular pattern is matched". The pattern he used as a proof of concept was the login command for Unix. This command is absolutely critical to any Unix system, as it protects users' private data from other users via a password. Thompson was able to generate a C compiler that compiled login in a special way, giving the command a bug that would allow anyone superuser privileges if they typed a specific password that he came up with. This C compiler would compile all other code normally. Worse, he showed that the compiler could be made to recognize when it was compiling itself and re-insert the attack, so the backdoor would survive even if the compiler's own source were inspected and recompiled from a clean copy. Thompson went on to discuss the all-too-possible situation in which a person with enough knowledge and experience could plant exploits like this in assembly and machine code, making them almost undetectable. "You can't trust code that you did not totally create yourself," Thompson concluded.
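The first half of the trick can be illustrated with a toy "compiler" in a few lines. This is only a sketch of the concept in Python, not Thompson's actual C code, and the pattern, password and function names are invented for illustration:

```python
# Toy illustration of a "trusting trust"-style compiler: it passes most
# source through untouched, but silently appends a backdoor whenever the
# source matches a particular pattern (here, a password-checking routine).
BACKDOOR = '\nif password == "ken-secret": grant_root()  # injected!\n'

def evil_compile(source: str) -> str:
    """Pretend to compile `source`; miscompile anything that looks like login."""
    if "def check_password" in source:   # the pattern being matched
        return source + BACKDOOR         # inject the secret backdoor
    return source                        # every other program compiles normally

login_src = "def check_password(password):\n    ...\n"
other_src = "def add(a, b):\n    return a + b\n"
print(BACKDOOR in evil_compile(login_src))  # → True
print(BACKDOOR in evil_compile(other_src))  # → False
```

The truly insidious second half of Thompson's attack, which this sketch omits, is adding a second pattern that matches the compiler's own source, so that `evil_compile` reproduces the injection logic even when built from clean code.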
[figure 7]
Proof of concept code by Thompson
Ken Thompson

With this horrifying revelation, it is very clear that it is not possible to be completely secure. We can make our passwords as strong as possible, but are those passwords effective if we can't even trust our own tools? I will still do my best in regard to conventional security practices, but I acknowledge that they cannot protect me against a skilled enough attacker.

I leave you with a quote:

"The only system which is truly secure is one which is switched off and unplugged, locked in a titanium-lined safe, buried in a concrete bunker, and surrounded by nerve gas and very highly paid armed guards. Even then, I wouldn't stake my life on it."
-- Gene Spafford, Director, Computer Operations, Audit, and Security Technology (COAST) Project, Purdue University

Sunday, November 24, 2013

Artificial Intelligence: A Close Encounter

When the average person talks about artificial intelligence, they frequently recall 2001: A Space Odyssey, Terminator and a myriad of other movies, TV shows and games in which computers turn against their human masters. However, what most people don't realize is that while computers aren't capable of becoming sentient and overthrowing their owners, artificial intelligence has developed to the point of becoming an effective tool for solving complex problems. I have been lucky enough to witness the power of artificial intelligence firsthand!

Between June 2011 and January 2013, I worked on and off at a small startup company in my hometown. Until fairly recently, I was unable to talk about the work I did there due to a non-disclosure agreement, but now that my superior has received a patent for his good work, I am able to discuss it somewhat.

During my 1.5-year internship with Cognisense Labs, I helped improve my mentor's invention: an autonomous system for planting flower bulbs. The system consisted of an industrial robotic arm, an image capture device and a powerful computer running my mentor's (now patented) image analysis algorithms. It was used to locate, orient and plant flower bulbs into crates before they were taken to a storage facility.



Robots and artificial intelligence are key to eliminating the need for humans to perform dull, repetitive and sometimes dangerous tasks. Prior to his robotic farming endeavors, my mentor invented a driverless mine-sweeping vehicle to clean up former war zones safely, eliminating the need for innocent people to put themselves in harm's way.



Some people see artificial intelligence as death and destruction, but I see the opposite. Artificial intelligence is a catalyst for making the world a better place!

Sunday, November 17, 2013

A Brief History of the Unix Operating System

Some people may consider operating systems such as MS-DOS and OS/2 to be quite old, but did you know that there is a family of operating systems that has maintained popularity for over 40 years? The Unix family (along with its lengthy list of clones), despite undergoing many iterations and revisions, has remained a favorite among users and developers alike. Even those who have never heard of it may find that they are, in fact, using Unix under a different name!

A "family photo" of Unix and its derivatives


The history of Unix began at Bell Labs in the mid-1960s. In a joint project with M.I.T., Bell Labs was developing a new time-sharing operating system called the Multiplexed Information and Computing Service, or MULTICS for short.[1] On the MULTICS project were two bright, young and ambitious engineers named Dennis Ritchie and Ken Thompson. Despite being developers of MULTICS, Ritchie and Thompson did not agree with the design principles behind it: MULTICS was large and overcomplicated. Frustrated with the system's design flaws, Ritchie and Thompson began to write their own operating system on an old, unused PDP-7. They called their system UNICS, which stood for UNIplexed Information and Computing Service, a playful jab at the unnecessary complexity of MULTICS.[2] The original UNICS was finished in 1969, and the name was changed to Unix TSS (Time Sharing System) shortly after.

Dennis Ritchie and Ken Thompson programming on a teletype console in the 1970s


The new system was popular within Bell Labs, but it did not gain wider attention until 1973, when Dennis Ritchie rewrote it in the C language. This change thrust the fledgling operating system into the spotlight and changed the face of operating system design forever. Being rewritten in C made Unix the world's first portable operating system, meaning it was the first operating system that could run on different types of computers.

The simple, elegant design of Unix made it enjoyable to use and its portability made it cheap and easy to obtain. These qualities also made Unix an ideal system to draw inspiration from and today there are many systems that are considered to be derivatives.

From super computers to the smartphone in your pocket, Unix is everywhere and it is likely to continue to be everywhere for many years to come!


Android, Linux and Mac OS X are all derivatives of Unix

[1] http://web.mit.edu/multics-history/
[2] http://www.unix.org/what_is_unix/history_timeline.html


Sunday, November 10, 2013

File Sharing: A Brief History

In the 1970s and most of the 1980s, the Internet had a very limited audience, but file sharing was already taking place. The participants, however, were not average people looking to make copies of movies and music. Instead, the primary demographic of file sharers was curious children and adults looking to learn more about the world of computers. Type-in programs (source code listings meant to be typed in by hand) were often distributed in computer hobbyist magazines in hopes of promoting computer literacy and inspiring users to become programmers.

File sharing quickly graduated to more complex means of distribution, the immediate next step being downloads through bulletin board systems. In the early days of public network access, bulletin board systems (BBSs) were like a primitive combination of a forum and a chat room where people could go to talk, play games and, of course, share files. However, in times when connection speeds were measured in baud, file sharing over a phone line wasn't terribly practical yet.


The most recent major innovation in file sharing has, of course, been peer-to-peer distributed file sharing. Napster is widely credited as the first peer-to-peer file sharing network. Due to legal issues, Napster was temporarily shut down and later re-opened as a paid service.