Sunday, December 15, 2013

Scientific Computing


There are a million uses for a computer in the field of science. A million. So, as you may be able to guess, there are about a million ways in which I could tell you scientific computing happens. But instead of boring you with endless lists of ways things happen in the lab, I will just tell you a few ways that computers are useful to specific types of scientists.

The first is computational science: the science of constructing mathematical models and quantitative analysis techniques to study scientific problems. Scientists write programs that model the systems being studied and then run these programs on different sets of input parameters. The next is numerical analysis, the study of algorithms that use numerical approximation for the problems of mathematical analysis. Modern numerical analysis does not attempt to find exact answers, but instead obtains approximate solutions while keeping a certain small margin of error. Symbolic computation is another branch of scientific computing. It focuses on studying and developing algorithms and software for manipulating mathematical expressions or objects, which can be computed to exact or approximate values.
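
To make the numerical analysis idea concrete, here is a minimal Python sketch (the function and tolerance below are just illustrative choices, not from any particular source) that approximates a square root with Newton's method, stopping once the result is within a chosen margin of error:

    # Newton's method: approximate sqrt(a) to within a chosen tolerance.
    # The tolerance 1e-10 is an arbitrary "small margin of error".
    def approx_sqrt(a, tol=1e-10):
        x = a if a > 1 else 1.0           # rough initial guess
        while abs(x * x - a) > tol:
            x = (x + a / x) / 2           # refine by averaging x and a/x
        return x

    print(approx_sqrt(2))  # about 1.4142135..., never exact, but close enough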

Computational physics, biology, and chemistry are the study and development of algorithms to solve problems in physics, biology, or chemistry. Computational physics was the first of the three to use computers to solve problems. An example of a problem solved in physics is the matrix eigenvalue problem, which finds eigenvalues and their corresponding eigenvectors for very large matrices. Lastly, computational neuroscience is the science of studying the information processing properties of the brain and nervous system, modeling the essential features of a biological system at multiple levels of detail. It generally deals with single-neuron modeling, development, axonal patterning and guidance, sensory processing, memory, cognition, discrimination, and learning.
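
To make the eigenvalue example concrete, here is a small Python sketch using NumPy (assuming it is installed; the 2x2 matrix is a toy stand-in, where real problems involve matrices with thousands or millions of rows):

    import numpy as np

    # A tiny symmetric matrix standing in for the huge ones physicists face.
    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

    # eig returns the eigenvalues and a matrix whose columns are eigenvectors.
    eigenvalues, eigenvectors = np.linalg.eig(A)
    print(eigenvalues)         # 3.0 and 1.0 for this matrix
    print(eigenvectors[:, 0])  # eigenvector paired with the first eigenvalue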


Computer Graphics


Computer graphics are what make computer screens so fun to look at. What would computers be without graphics? Things would be quite a bit more difficult to learn without that handy image or graphic to reinforce all the concepts we learn every day while online. The fact is that computer graphics are quite necessary, and rendering them wasn't always an easy task.

There are a few different stages when it comes to generating visual images, and the first of them is 2D image or pixel computations. These deal with rotating or displaying images in a certain defined area and shape on the screen. Then things get a bit more complicated when you get into curve computations. These have to do with Bézier curves and matching those curves to other shapes to create images. Finally, we get into 3D computations. These involve rotating a 3D point or computing the volumes of 3D shapes, taking a number of 2D objects and building a 3D object from them (such as a mesh), and testing whether objects in 3D intersect or touch. These are just some of the light starter concepts you meet when you begin to look into how computer graphics work and the algorithms behind them.
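
To give a taste of that first stage, here is a minimal Python sketch (the point and angle are arbitrary) that rotates a 2D point around the origin using the standard rotation formulas:

    import math

    # Rotate the point (x, y) counterclockwise by `angle` radians.
    def rotate(x, y, angle):
        cos_a, sin_a = math.cos(angle), math.sin(angle)
        return (x * cos_a - y * sin_a,
                x * sin_a + y * cos_a)

    print(rotate(1.0, 0.0, math.pi / 2))  # roughly (0.0, 1.0)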


Monday, December 2, 2013

Communications and Security


What would we do without the internet? Or, to rephrase that question, what would the internet do without security? Would it be useful to us at all? I believe it would still be useful, but not half as much. The internet would not be good for much more than information gathering. What I am saying is that security is what makes the internet so useful to us all. It is what allows us to use the internet to transfer important data and to accomplish important everyday tasks online.

Without security, the internet is not good for much at all. That is why one of the greatest weapons to help you protect your security online is cryptography. “Cryptography has the power to provide secure communications, protect transactions, provide powerful privacy, and validate the integrity of information.” The one problem with cryptography is that most people don’t know how to use it or work with it effectively. There are numerous users on the internet (most of the general population, in my opinion) who never inspect the cryptographic security certificates of the secured HTTPS websites they visit. When these same average users install applications, they never check whether they come from trusted sources (although even installing apps from trusted sources is not necessarily a guarantee of their security, either). The internet is growing at a frightening pace. At this stage of development, most new users of the internet have no idea about things like security awareness or the security mechanisms they can use to protect themselves. That is why we, as computer scientists, must pay close attention to making these cryptographic exchanges of information as foolproof and user-friendly as possible.
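
For the curious, here is a rough sketch of what inspecting a site's certificate can look like in code, using Python's standard ssl module (the hostname is just an example):

    import socket
    import ssl

    hostname = "www.example.com"  # any HTTPS site you want to check
    context = ssl.create_default_context()

    # The TLS handshake itself fails if the certificate is invalid
    # or does not match the hostname.
    with socket.create_connection((hostname, 443)) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
            print(cert["subject"])   # who the certificate was issued to
            print(cert["notAfter"])  # when it expires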


Artificial Intelligence



Many decades have been spent in the attempt to emulate human intelligence with a computer. This was the original definition of artificial intelligence, after all. “The 1950s and ’60s believed success lay in mimicking the logic-based reasoning that human brains were thought to use. In 1957, the AI crowd confidently predicted that machines would soon be able to replicate all kinds of human mental achievements.” This was simply not true. Part of the reason for this flawed reasoning was that we still don’t really understand how the human brain works, which makes emulating its logical thought paths even more difficult. This is what caused a major shift in artificial intelligence technology: we did not understand what we were trying to emulate. So these days, “artificial intelligence,” as it’s called, has changed shape to tackle discrete, simpler problems one at a time. “Today’s AI doesn’t try to re-create the brain. Instead, it uses machine learning, massive data sets, sophisticated sensors, and clever algorithms to master discrete tasks.”

The fact is that computers lend themselves to certain types of tasks much better than they do to others. The simplest example is that computers have no capacity for emotion, only logic-based decisions. Computers need parameters in order to make a decision, whereas humans are capable of making decisions without any relevant data if they are so inclined. Even if a computer were to generate a random number, it would still have parameters on what kinds of numbers were within its domain, or workable set. Due to these factors, I believe “true” artificial intelligence (emulating the human brain) is impossible for a modern computer to achieve. However, computers can achieve tasks that are useful to humans in so many other ways, so why not redefine “artificial intelligence”? We have.
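
As a trivial illustration of that last point, even “randomness” on a computer is bounded by parameters we supply (the range below is arbitrary):

    import random

    # The computer cannot pick "any number": we must define the domain first.
    roll = random.randint(1, 6)  # only the integers 1 through 6 are possible
    print(roll)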



History of Computer Science



Computer Science has a rich history, as it should, being “the science of using computers to solve problems”. There is no better way to live the easy life than getting a computer to do your work for you. Computers have changed the way the modern world works, in a big way. We depend on computers for everything from basic mathematical calculations all the way up to rendering graphics in 3D or telling us the shortest path between two points. We have now become fairly dependent on computers to perform daily tasks for us. If we had to convert back to analog methods, or in other words if we were now forced to get along without computers, our future would be pretty dim. The rate at which technology is being developed would almost flat-line. Very few people would even be capable of tasks that are simple for a computer, like calculating their taxes to send in to the federal and state governments.

To me, the history of computer science began with number theory: how can combinations of things stand for other things of use? Once digital logic was invented, it became the birthing ground for a real computer with which to study and further develop computer science. The years from about 1900 to 1939 were when the necessity of doing complex mathematical calculations drove the development of the early “calculation machines”. Then, in the 1940s, Howard Aiken, with the assistance of IBM, built the Harvard Mark I, one of the first useful large-scale digital computers. As for the 1950s: “In hardware, Jack Kilby (Texas Instruments) and Robert Noyce (Fairchild Semiconductor) invented the integrated circuit in 1959.” It was not until the 1960s, though, that computer science really came into its own as a discipline. In the 1970s and ’80s, a public-key cryptosystem (RSA) was created and Apple Computer helped popularize the personal computer, respectively. These days, computers are getting smaller and smaller thanks to nanotechnology. Thanks to the “information superhighway”, the rate at which new findings and data are shared is simply astounding. The internet is also a large contributor to the pace at which computer science has developed in the last twenty years.


File Sharing


File sharing is just what it sounds like: sharing files with someone else, either over the net or physically. Sounds simple, right? Well, it gets way more complicated when copyright laws come into play for the information being distributed. File sharing has become a great concern for copyright holders in the digital media industry, namely the film and music industries.

To date, there are more than a few ways to share files on the net as well, which is where things really start to get complicated. One of them involves directly posting files to a server and allowing people to download them. This method is oftentimes not very reliable: the filenames are usually modified to prevent the owner of the information from knowing exactly what is contained in the download, and once the owner does find the file, they will force the server to take down the link to it. Another method is called peer-to-peer, or P2P for short. This is a method of sharing files where one person makes the file available directly from their computer to the other computer that is downloading it. A variation on P2P is called torrenting, in which files are broken up into tiny pieces and a torrent file (.torrent extension) contains the instructions for putting the pieces together again. Numerous lawsuits have been fought on the topic of file sharing, but no matter the number of lawsuits won or lost, they haven’t affected the desire of millions and millions of people to continue sharing these files back and forth every single day.
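
To give a flavor of the torrenting idea, here is a rough Python sketch (the piece size and filename are made up for illustration) of splitting a file into fixed-size pieces and hashing each one, which is essentially the bookkeeping a torrent file records so downloaders can verify and reassemble the pieces:

    import hashlib

    PIECE_SIZE = 256 * 1024  # 256 KiB pieces, a common choice

    # Hash each piece of the file; a torrent file stores these hashes so
    # every piece received from a peer can be checked before reassembly.
    def piece_hashes(path):
        hashes = []
        with open(path, "rb") as f:
            while True:
                piece = f.read(PIECE_SIZE)
                if not piece:
                    break
                hashes.append(hashlib.sha1(piece).hexdigest())
        return hashes

    print(piece_hashes("some_file.iso"))  # hypothetical file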


Data Structures



A data structure is “a particular way of storing and organizing data in a computer so that it can be used efficiently.” There are many different kinds of data structures. Usually one data structure is chosen over another because it is better suited to the task at hand. Some examples of data structures are arrays, records, hash tables, unions, sets, graphs, and objects. The purpose of all of these is to manage large amounts of data efficiently. For example, an array stores its elements in a specific order. There is an index and an element; the index tells where in the array the element is stored. The elements of an array can often be of any data type, and arrays can usually contain other data structures within them, such as a nested array, where one array is stored inside another.
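
Here is a quick Python sketch of the indexing and nesting ideas described above (the values are arbitrary):

    # A list (Python's array-like structure): each element lives at an index.
    scores = [90, 85, 72]
    print(scores[1])   # 85 -- the index says where the element is stored

    # Data structures can nest: here, lists inside a list.
    grid = [[1, 2], [3, 4]]
    print(grid[1][0])  # 3 -- second inner list, first element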


Data structures are part of the beauty of modern high-level programming languages. They can store data in many more ways than the few primitive data types of many years ago. Being able to store data in these more complex data structures shortens the amount of time and code it takes to manipulate the data itself. The more modern data structures save even more time when it comes to entering large amounts of data as well. Overall, data structures are one of the most important concepts in modern computer science, because without a way to store data, how could we ever hope to manipulate it?