Monday, March 11, 2013

Complexity and Runtime

I am really enjoying what we're learning about complexity right now; the idea that we can analyse algorithms in order to redesign them with efficiency in mind seems like it will be an important tool now and in the future. What I'm wondering now, though, is just how big an impact this has on modern programming.

In CSC148 (Introduction to Computer Science), there is a large emphasis on object-oriented programming: using clever manipulation of objects to create data structures with specific attributes, and taking a somewhat idealistic, high-level approach to programming in general. For example, in class we're focusing on creating and manipulating trees, which is certainly interesting and a natural place to use recursion. But suppose we used these trees to store objects that inherit from, or are subclasses of, other objects that are themselves sandwiched into a class hierarchy, and we wound up needing deep recursive calls and a great many of these objects just to store values. Is there a point where we have to say no because of the physical limitations of carrying out nested function calls? Which cases are best to program procedurally, and in which are we better off creating our own objects? Is there a rule of thumb for the complexity of code, and if so, how would it scale with Moore's Law? I know that in enterprise software development there would certainly be questions about runtime. This subject is one of the big reasons why I enjoy playing around with assemblers.
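To make that question about "physical limitations" concrete, here is a small sketch in Python (my own toy Tree class and node counts, not anything taken from the course): a recursive traversal of a very tall tree runs into the interpreter's recursion limit, while an iterative version with an explicit stack does not.

import sys

class Tree:
    # A bare-bones tree node: a value plus a list of child nodes.
    def __init__(self, value, children=None):
        self.value = value
        self.children = children if children is not None else []

def count_recursive(t):
    # Count nodes by recursing into each child: one stack frame per level of depth.
    return 1 + sum(count_recursive(child) for child in t.children)

def count_iterative(t):
    # Count nodes with an explicit stack, so the height of the tree no longer matters.
    count, stack = 0, [t]
    while stack:
        node = stack.pop()
        count += 1
        stack.extend(node.children)
    return count

if __name__ == '__main__':
    # Build a degenerate "tree" that is really a 5000-node chain, far deeper
    # than CPython's default recursion limit.
    root = Tree(0)
    node = root
    for i in range(1, 5000):
        node.children.append(Tree(i))
        node = node.children[0]

    print("recursion limit:", sys.getrecursionlimit())  # usually 1000
    print("iterative count:", count_iterative(root))    # 5000
    try:
        print("recursive count:", count_recursive(root))
    except RecursionError:
        print("recursive count blew past the recursion limit")

The iterative version just trades the call stack for a list we manage ourselves, so the limiting factor becomes memory rather than stack depth; trees that tall are rare in practice, but it is one concrete case where the "ideal" recursive solution hits a physical limit.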

Sunday, March 3, 2013

Laws of Inference

This week we learned a few rules of inference, which are pretty interesting; they're starting to give me a broader conception of how to construct proofs. Before they were stated explicitly, I was often confused about where we were going in class and how I was supposed to infer certain behaviours, or to pull from 'common knowledge' the properties of specific sets of numbers, absolute value functions, inequalities, etc. (which I now realize are covered in the course prerequisites). Now that I have access to a tool-set, I have a renewed sense of confidence in my future ability to actually prove something; I just need to learn as much as I can from the mathematical prerequisites, and to study proving things much more regularly as well.
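For example, two of the standard rules (paraphrasing from memory rather than quoting the course notes): from P => Q and P we may conclude Q (modus ponens), and from P ^ Q we may conclude P on its own (conjunction elimination).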

On the bright side, I'm having no trouble reading or writing logic symbols any more; I feel somewhat literate in the realm of mathematics (or at least the parts we've studied) now. I also had very poor time-management skills at the beginning of the course and couldn't keep up with my coursework (note the 4 missing SLOGs), but I'm trying to redouble my efforts in all of my courses, especially this one.

Saturday, January 26, 2013

Arithmetic Laws

The grammar and representation of logical expressions seem impressive in general.

For example:

Considerably more time goes into figuring out what this means and verifying it:

P ^ (Q V ~Q) <=> P <=> P V (Q ^ ~Q)

Than what this means, and verifying it:

P(1) = P = P + 0

But they state the same thing really, especially if you look at it from a logic-gate perspective, which is why I'm cautiously optimistic about this course. Knowing every Boolean algebra law discussed in this course so far gives me a certain degree of confidence in interpreting this weird world of logic. What I'm wondering now is whether it's a crutch that I am compelled by habit to convert expressions containing conjunctions and disjunctions into Boolean algebraic equations in my head, and to convert them back once I've understood or transformed them. This method makes expressions orders of magnitude easier for me to understand, but it feels like sloppy methodology, as though I'm cheating myself.
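As a quick sanity check on that equivalence, here is a short Python snippet (my own throwaway check, not course material) that brute-forces the truth table:

from itertools import product

# Check that P ^ (Q V ~Q), P, and P V (Q ^ ~Q) agree on every row of the
# truth table -- the same collapse as P(1) = P = P + 0 in Boolean algebra.
for p, q in product([True, False], repeat=2):
    left = p and (q or not q)
    middle = p
    right = p or (q and not q)
    assert left == middle == right

print("All three forms agree for every assignment of P and Q.")

Brute-forcing the table obviously doesn't scale to many variables, but with only P and Q it settles the question instantly.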

Beyond that, I'm not particularly struggling with any concept in this course so far. It's definitely challenging, but I'm enjoying it, so I can't complain, although trying to remember the technical definitions of every operator isn't quite as fun.

Thursday, January 10, 2013

Hello World