Sunday, December 15, 2013

Scientific Computing: Scientific Research as a Precision Tool

The creation of a complex system that uses real-world data to solve problems often requires a high degree of precision, and that level of precision is rarely achievable without scientific research. As someone with experience working with artificial intelligence and assembly-line automation, I can confirm that prior scientific research is essential for analytical systems in a factory setting (especially those that handle organic objects).

For example, if a food company wants to design a system for checking the quality of potato chips, data would need to be collected on the chips themselves before a solution could be developed. The first step is to define what quality of product is acceptable for shipment. What are the properties of such a chip? What is its shape, size, color, pH level, trans fat content, etc.? Once the acceptable quality is determined, it is also necessary to characterize what is not acceptable. These quality levels can be established through scientific tests and data analysis that identify which properties make the product tasty, unhealthy, visually appetizing, etc. Using this data, thresholds can be created that define exactly what makes a "good" chip and what makes a "bad" one. From there, it is possible to create algorithms that analyze the product with a variety of sensors and dispose of defective chips.
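The thresholding step described above can be sketched in code. This is a minimal sketch in Python; the property names and threshold values are made up for illustration and would, in practice, come from the scientific testing and data analysis described above:

```python
# Hypothetical quality thresholds. Real values would be derived from
# scientific tests and data analysis of actual product samples.
THRESHOLDS = {
    "diameter_mm": (30.0, 70.0),    # acceptable size range
    "green_fraction": (0.0, 0.05),  # at most 5% green surface area
    "trans_fat_g": (0.0, 0.5),      # per-serving limit
}

def is_good_chip(measurements):
    """Return True only if every measured property falls inside its threshold.

    A missing measurement counts as a failure, since the chip cannot be
    verified as acceptable.
    """
    for prop, (lo, hi) in THRESHOLDS.items():
        value = measurements.get(prop)
        if value is None or not (lo <= value <= hi):
            return False
    return True
```

A sensor pipeline would fill in the `measurements` dictionary for each chip and route any chip where `is_good_chip` returns `False` to the reject bin.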

Green potato chips are sometimes considered undesirable by consumers
During my internship, I assisted my mentor in applying scientific research and analysis to flower bulbs (see my previous post on A.I.). In organic materials, the properties of individual objects vary greatly depending on their genetic makeup. It is very important to prepare for this diversity when using computers to solve problems!

Sunday, December 8, 2013

Computer Graphics: How a Teapot "Shaped" an Industry

In the 1970s, 3D computer generated imagery was a new concept being actively researched. At that time, very few computers had enough power to generate 3D images, and even if they did, special hardware (e.g. a graphics terminal) was required to view the resulting rendering. During this time, the University of Utah was a big name in computer graphics research. One of the university's most well-known researchers in the area was Martin Newell. In 1975, he needed a recognizable shape that could be used as a benchmark to test the 3D rendering system he was developing. One day, Newell was drinking tea with his wife and noticed that the teapot's shape had a very diverse range of mathematical properties, making it the perfect object to model his benchmark after.
Martin Newell's Teapot Sketch.

Modern Rendering


Newell then sketched the teapot onto graph paper and entered the coordinates into a computer. The result was an endearing image of a teapot that left a big impact on computer graphics. For decades after, the Utah Teapot continued to be used as a benchmark for graphics capabilities. These days, rendering the teapot in films and other works is an inside joke among graphics artists. The original teapot is on exhibit at the Computer History Museum in Mountain View. I was privileged enough to see it there recently!
The teapot makes a cameo in Toy Story
The Utah Teapot on display
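The dataset Newell produced describes the teapot as a set of bicubic Bézier patches, each defined by a 4x4 grid of control points. As a minimal sketch (the flat test patch below is my own toy data, not actual teapot coordinates), evaluating one point on such a patch looks like this:

```python
from math import comb

def bernstein3(i, t):
    """Cubic Bernstein basis polynomial B(i,3) evaluated at t."""
    return comb(3, i) * t ** i * (1 - t) ** (3 - i)

def bezier_patch_point(control, u, v):
    """Evaluate a bicubic Bezier patch at parameters (u, v) in [0, 1].

    `control` is a 4x4 grid of (x, y, z) control points, the same form
    as the 16-point patches in the Utah teapot dataset.
    """
    x = y = z = 0.0
    for i in range(4):
        for j in range(4):
            w = bernstein3(i, u) * bernstein3(j, v)
            px, py, pz = control[i][j]
            x += w * px
            y += w * py
            z += w * pz
    return x, y, z
```

A renderer sweeps u and v over a fine grid, turning each patch into a mesh of triangles that can then be shaded and displayed.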

Sunday, December 1, 2013

Communications and Security: Are Your Tools Working Against You?

Thanks to Edward Snowden, Julian Assange, the NSA and others, information security has become a critical concern for many people across the globe. Computer users everywhere are concerned that they may be under the surveillance of some kind of government entity. Many experts say that encrypting data and using strong passwords can help a lot, but is it enough? Today, I'd like to talk about the frightening possibility that protection against surveillance may be in a place that is unreachable to most people: compilers.

In 1984, Ken Thompson, the co-creator of Unix, gave a very interesting lecture on information security as part of his Turing Award acceptance. The speech, titled "Reflections on Trusting Trust", gave an extremely alarming insight into just how difficult it is to keep data secure. In this talk, Thompson described how he was able to modify the source code of his C compiler to "deliberately miscompile source whenever a particular pattern is matched". The pattern he used as a proof of concept was the login command for Unix. This command is absolutely critical to any Unix system, as it protects users' private data from other users via a password. Thompson was able to generate a C compiler that compiled login in a special way, giving the command a bug that would allow anyone superuser privileges if they typed a very specific password that he came up with. This C compiler would compile any other code normally. Thompson went on to discuss the all too possible situation in which a person with enough knowledge and experience could plant exploits like this into assembly and machine code, making them almost undetectable. "You can't trust code that you did not totally create yourself," Thompson concluded.
[figure 7]
Proof of concept code by Thompson
Ken Thompson
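The trick can be illustrated with a toy model. This is my own sketch in Python, not Thompson's actual C code: a "compiler" that passes everything through untouched, except source matching a login-like pattern, into which it quietly injects a backdoor password check.

```python
def evil_compile(source):
    """Toy pass-through "compiler" that mis-compiles one particular pattern."""
    pattern = "def check_password(user, password):"
    backdoor = '    if password == "k3n": return True  # silently injected'
    if pattern in source:  # the matched pattern: a login-style routine
        return source.replace(pattern, pattern + "\n" + backdoor)
    return source  # all other code "compiles" normally

# The login program's source looks completely innocent...
login_source = (
    "def check_password(user, password):\n"
    '    return password == "correct horse"\n'
)
# ...but the compiled version also accepts the attacker's password "k3n".
compiled = evil_compile(login_source)
```

Nothing in `login_source` hints at the backdoor; only an inspection of the compiler (or the compiled output) would reveal it, which is exactly Thompson's point.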
With this horrifying revelation, it is very clear that it is not possible to be completely secure. We can make our passwords as strong as possible, but are those passwords effective if we can't even trust our own tools? I will still do my best with conventional security practices, but I acknowledge that they cannot protect me against a skilled enough attacker.

I leave you with a quote:

"The only system which is truly secure is one which is switched off and unplugged, locked in a titanium-lined safe, buried in a concrete bunker, and surrounded by nerve gas and very highly paid armed guards. Even then, I wouldn't stake my life on it."
-- Gene Spafford, Director, Computer Operations, Audit, and Security Technology (COAST) Project, Purdue University

Sunday, November 24, 2013

Artificial Intelligence: A Close Encounter

When the average person talks about artificial intelligence, they frequently recall 2001: A Space Odyssey, Terminator and a myriad of other movies, T.V. shows and games in which computers turn against their human masters. However, what most people don't realize is that while computers aren't capable of becoming sentient and overthrowing their owners, artificial intelligence has developed to the point of being an effective tool for solving complex problems. I have been lucky enough to witness the power of artificial intelligence first-hand!

Between June 2011 and January 2013, I worked on and off at a small start-up company in my hometown. Until fairly recently, I was unable to talk about the work I did there due to a non-disclosure agreement, but now that my superior has received a patent for his work, I am able to discuss it somewhat.

During my 1.5-year internship with Cognisense Labs, I helped improve my mentor's invention: an autonomous system for planting flower bulbs. The system consisted of an industrial robotic arm, an image-capture device and a powerful computer running my mentor's (now patented) image-analysis algorithms. It was used to locate, orient and plant flower bulbs into crates before they were taken to a storage facility.



Robots and artificial intelligence are key to eliminating the need for humans to perform dull, repetitive and sometimes dangerous tasks. Prior to his robotic farming endeavors, my mentor invented a driverless mine-sweeping vehicle to clean up former war zones safely, eliminating the need for innocent people to put themselves in harm's way.



Some people see artificial intelligence as death and destruction, but I see the opposite. Artificial intelligence is a catalyst for making the world a better place!

Sunday, November 17, 2013

A Brief History of the Unix Operating System

Some people may consider operating systems such as MS-DOS and OS/2 to be quite old, but did you know that there is a family of operating systems that has maintained popularity for over 40 years? The Unix family (along with its lengthy list of clones), despite undergoing many iterations and revisions, has remained a favorite among users and developers alike. Even those who have never heard of it may find that they are, in fact, using Unix under a different name!

A "family photo" of Unix and its derivatives


The history of Unix began at Bell Labs in the mid-1960s. In a joint project with M.I.T., Bell Labs was developing a new time-sharing operating system called the Multiplexed Information and Computing Service, or MULTICS for short.[1] On the MULTICS project were two bright, young and ambitious engineers named Dennis Ritchie and Ken Thompson. Despite being developers of MULTICS, Ritchie and Thompson did not agree with the design principles behind it; MULTICS was large and overcomplicated. Frustrated with the system's design flaws, they began to write their own operating system on an old, unused PDP-7. They called their system UNICS, which stood for UNIplexed Information and Computing Service, a playful jab at the unnecessary complexity of MULTICS.[2] The original UNICS was finished in 1969, and the name was changed to Unix TSS (Time Sharing System) shortly after.

Dennis Ritchie and Ken Thompson programming on a teletype console in the 1970s


The new system was well liked within Bell Labs, but it did not gain wider popularity until 1973, when Dennis Ritchie rewrote it in the C language. This adjustment threw the fledgling operating system into the spotlight and changed the face of operating system design forever. Being rewritten in C made Unix the world's first portable operating system, meaning it was the first operating system that could run on different types of computers.

The simple, elegant design of Unix made it enjoyable to use and its portability made it cheap and easy to obtain. These qualities also made Unix an ideal system to draw inspiration from and today there are many systems that are considered to be derivatives.

From super computers to the smartphone in your pocket, Unix is everywhere and it is likely to continue to be everywhere for many years to come!


Android, Linux and Mac OSX are all derivatives of Unix

[1] http://web.mit.edu/multics-history/
[2] http://www.unix.org/what_is_unix/history_timeline.html


Sunday, November 10, 2013

File Sharing: A Brief History

In the 1970s and most of the 1980s, the Internet had a very limited audience, but file sharing was already alive and well. The participants, however, were not average people looking to make copies of movies and music. Instead, the primary demographic of file sharers were curious children and adults looking to learn more about the world of computers. They shared type-in programs: program listings distributed in computer hobbyist magazines in hopes of promoting computer literacy and inspiring users to become programmers.

File sharing quickly graduated to more complex means of distribution, the immediate next step being downloads through bulletin board systems (BBSs). In the early days of public access to the Internet, BBSs were like a primitive forum/chat room where people could go to talk, play games and, of course, share files. However, in times when download speeds were measured in hundreds of baud, file sharing over the Internet wasn't terribly practical yet.
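Some back-of-envelope arithmetic shows just how impractical it was. Assuming a 2400 bit/s modem (an illustrative round number for the era), a single megabyte took most of an hour:

```python
def download_seconds(file_bytes, bits_per_second):
    """Idealized transfer time, ignoring protocol overhead and line noise."""
    return file_bytes * 8 / bits_per_second

# 1 MB at 2400 bit/s is roughly 3333 seconds, i.e. close to an hour.
minutes = download_seconds(1_000_000, 2400) / 60
```

At those speeds, sharing a whole album or a movie was out of the question; small text files and program listings were the realistic cargo.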


The most recent major innovation in file sharing has, of course, been peer-to-peer distributed file sharing. Napster is widely credited as the first peer-to-peer file sharing network. Due to legal issues, Napster was temporarily shut down and then re-opened as a paid service.

Sunday, October 13, 2013

OPEN SOURCE: What it is and what it isn't


This is the most difficult blog post I have had to write to date. While some people only know “open source” as a buzzword they heard on the street or in a forum, the ideology behind this term has had an extremely profound effect on my life, especially in the context of how I develop software. To avoid posting a giant wall of text, I will limit my post to dispelling some myths and explaining some concepts about open source software (OSS) that the public sometimes has difficulty grasping.

Here are some myths:


1) You cannot make money off OSS / OSS is equivalent to freeware

While it is true that much OSS does not require purchase, there are many businesses that make money from open source software. Even some pieces of OSS that appear to be "free" are actually making money for the company producing them. For example, some companies charge for support of their software.


2) OSS is of low quality

While there will always be poorly written programs, the common generalization about the supposed low quality of OSS is false. I often encounter this stereotype alongside the related "open source software is disorganized" and "anyone can just submit code to an OSS project" myths. There is a basic hierarchy in all good OSS projects, just as there is in proprietary software projects. OSS projects have maintainers, and they (along with other contributors) review code submitted to the project before actually adding it to the program that is available for public download. Open source software development is not a chaotic influx of random contributions; it is more like a peer-review system for programmers.


3) OSS is not safe because criminals can see the code and harm users by exploiting it.

I always found this stereotype amusing, because the opposite is what actually happens to OSS projects. Because the code is visible to everyone, OSS code gets examined critically by a greater number of developers than proprietary code does. Because the source gets looked over so much, people find bugs quickly, and when a problem is found, it is usually fixed just as fast!

This post is already getting really long, so I'll wrap it up with a (very) brief summary of what open source is about. Open source is an ecosystem of development where programmers and corporations help other people by helping themselves. Let's say that Joe needs to solve a small problem. He writes a small program to solve it. Afterwards, Joe reads about other people experiencing a similar problem, so he decides to post his program to the internet with its source code under an open source software license. Because Joe shared his code, other people can make improvements to that code and use it for themselves. Joe has shown great kindness to people in need with a piece of code that he might otherwise have abandoned after solving his problem. Everyone wins!

AGILE: Tools for Producing Quality Software



In the world of software development methodologies, there is definitely more than one way to skin the proverbial cat. One of these (or rather, a group of them) is known as Agile software development. Agile is in fact an entire category of software development methodologies that are loosely connected by common goals detailed in the Agile Manifesto.

The basic idea behind Agile is that software development should be “people friendly” on all levels. Agile strives to bridge gaps in communication between software developers, managers and customers. It accomplishes this by encouraging developers to release code frequently and get feedback from the customer as frequently as possible.

Like any broad category of methodologies, some implementations of Agile are better than others. I am personally a fan of Story-driven modeling, Test-driven development and Pair Programming. Although I did not know what Agile was until recently, I have used these methodologies multiple times in the past, having had no idea that they were part of Agile.

In conclusion, I think Agile has been a valuable contribution to software development. It is important to have some kind of plan when building a large project of any kind and software is no exception. It's good to know that there are so many methodologies available to make sure future projects will be more likely to succeed!

Friday, September 20, 2013

LinkedIn and Branding: What we have forgotten


When critically acclaimed self-help guru and philosopher Napoleon Hill wrote Think and Grow Rich in 1937, there was no such thing as social networking. The buzzword "personal branding" did not exist yet either, but Napoleon Hill was among the first to publish a written work expressing the basic principle behind it. For those of you who don't know, Hill was born in 1883 and is famous for his philosophies regarding success and positive attitude. Hill was a journalist, and in the early twentieth century he learned the secret to success. He interviewed the world's most successful and influential people of the time, including (but not limited to) Andrew Carnegie, Henry Ford, Alexander Graham Bell and John D. Rockefeller. By interviewing these individuals, Hill came to understand what they had in common and why all of them were successful. He found that the secret to being successful and getting people to like/hire you (the primary motives behind "personal branding") was surprisingly simple.

Here it is, the secret to success:

1) Have a positive mental attitude
2) Have a passion for what you do
3) Continue to persevere, even in the face of defeat

Everyone should read at least one of this man's books


What exactly does this have to do with LinkedIn and social networking? Here's the bottom line: success is earned by genuine, hardworking individuals who want to see their dreams become reality, and LinkedIn (along with other social networks) has little to no effect on that. Trying to get a job on LinkedIn is like participating in a shouting contest with millions of people. You aren't going to get noticed simply by listing a bunch of skills you supposedly know and talking yourself up over the internet; LinkedIn has millions of users doing just that. Success happens when you start taking real-world actions to move yourself forward. I'm talking about working hard and enjoying every minute of it!

Now, before I finish up, I'd like to clarify that I don't think LinkedIn is bad. I am simply saying that standing out (in a good way) to employers on social networking sites is nigh impossible, and the effort would be better spent doing more work.

It's a shame that most young people today have probably never heard of Napoleon Hill and it's painfully obvious that a large chunk of humanity has forgotten the virtues that he described in his books.

I'll leave you with one of my favorite quotes from one of my favorite computer scientists:
"Talk is cheap. Show me the code." -Linus Torvalds

QR Codes: Not Just for Advertisements!

It seems like they are everywhere these days. On fliers, on soft drinks and even on coffee collars. What am I talking about? I'm talking about Quick Response (a.k.a. QR) codes. In today's smartphone-abundant, social-media-driven world, it's hard to miss these distinct-looking two-dimensional bar codes. Why, even as I type this post, I discovered one taped to the table at my immediate right!

When I first heard about QR codes, I was skeptical. It seemed like every time I saw one, it was stamped on an advertisement, beckoning everyone nearby to load up their smartphones with more advertisements. I was talking to a friend of mine about this when he stopped me. You see, my friend had spent a year studying abroad in Japan. Naturally, there were QR codes around every corner there, Japan being the birthplace of the QR code. My friend explained that QR codes had saved him from getting lost several times by providing a map of the surrounding area. I was intrigued. It was at that point that I decided I wanted to learn more about this interesting technology.

What really amazes me about QR codes is the design. While making the QR code for this blog, I discovered that it was possible to erase sizable chunks of the code and it would still be readable. This is due to the Reed-Solomon error correction used in modern QR code standards. The idea behind the error correction is that the encoded message carries redundant blocks, so some of the data can be corrupted (as long as it survives in one of the redundant blocks). This is also what allows codes to be customized aesthetically.
Anatomy of a QR Code
This QR code still works!
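The redundant-blocks idea can be sketched with a toy scheme. This is a drastic simplification of my own: real QR codes use Reed-Solomon parity symbols over GF(256) rather than plain copies, but the recover-from-surviving-data principle is the same.

```python
def encode_with_copies(message, copies=3):
    """Store the message several times, standing in for redundant blocks."""
    return [list(message) for _ in range(copies)]

def decode(blocks):
    """Rebuild the message, treating None as a damaged (erased) symbol."""
    out = []
    for i in range(len(blocks[0])):
        survivors = [block[i] for block in blocks if block[i] is not None]
        if not survivors:
            raise ValueError("symbol %d lost in every block" % i)
        out.append(survivors[0])
    return "".join(out)
```

As long as each symbol survives somewhere, the message decodes intact, which is why a QR code can lose sizable chunks (or have a logo drawn over it) and still scan.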
Although I initially dismissed the usefulness of QR codes, after researching them my opinion has changed. QR codes are an elegant way to empower users through technology and further bridge the gap between the physical and digital worlds. Next time I see one in the wild, I won't be so quick to judge!

Friday, September 6, 2013

Social Networking in Relation to Data Security

    While the widespread influence of social networking is undeniable, this convenient way of sharing information is not without consequences. Because of their increasing popularity, their ability to span multiple platforms and the lack of concern amongst users, social networking sites have become an easy target for cyber-criminals.

    In the field of information security, it is widely held that a system is only as safe as its weakest link. While most major social networking websites have a sizable team of security experts devoted to keeping networks and systems safe from intrusion, security issues within social media are not (usually) caused by poorly designed security systems. The main concern is the amount of care users take to keep their sensitive data out of harm's way. This is a problem both for businesses and for individuals who use social media for personal reasons. The users of these platforms often have little to no knowledge of good data-security practices and often resist taking precautions such as encrypting data and creating passwords long enough to resist brute-force attacks. Many users are also vulnerable to social engineering and will willingly give sensitive information to attackers posing as part of the system they are using.
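To see why password length matters so much against brute force, here is a rough sketch. The guess rate is an assumption I picked for illustration; real attack speeds vary enormously with hardware and how the passwords are stored.

```python
def years_to_exhaust(alphabet_size, length, guesses_per_second=1e10):
    """Years needed to try every password of the given length by brute force."""
    keyspace = alphabet_size ** length
    seconds = keyspace / guesses_per_second
    return seconds / (3600 * 24 * 365)

# 8 lowercase letters (26 symbols) fall in seconds at this guess rate,
# while 12 mixed-case letters and digits (62 symbols) would take
# thousands of years.
```

The takeaway is that keyspace grows exponentially with length, so a few extra characters buy far more protection than swapping in a symbol or two.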

    As social media becomes a bigger part of our lives, it is important to adopt safe data security practices. When one's identity is tied to a social media site, they risk everything from public embarrassment to identity theft when they don't do a good job of protecting their data. Don't let yourself be the weakest link in your own data protection plan!

Thursday, August 29, 2013

Hello World!

Hello there, reader! This is my first blog post. My name is Jonathan Vaccaro and I am a fourth-year student at San Jose State University studying computer science. I really love my major, and I have been working towards my goal of becoming a career programmer since the age of 10.

I feel that what is most exciting about computer science is its vast potential to solve problems that were very difficult or even impossible to solve before computers were invented. For example, telephone networks and railway systems were very difficult to manage and maintain in a reliable way before the invention of computers.

I am very excited about the bright future of robotics and automation in particular. During my first three years at San Jose State, I had an internship during holidays where I worked with an inventor at his start-up company. This company, Cognisense Labs, specialized in automation through vision processing, artificial intelligence and robotics. During that time, I realized that many menial tasks could be automated to improve efficiency, keep people out of danger and make the world a better place.



I am also very interested in operating systems and operating system design. I am very fond of Unix and I mentally reference the Unix Design Philosophy as inspiration for every program I write.


cout << "thanks for reading!";