If you blog it they will come?

Thursday, March 5, 2009

Warfare and network algorithms

How much does computer science owe to the military?

Well let's see...

  • The Department of Defense basically invented the internet, and with it "our entire field," to paraphrase the UW CSE department chair during one of his lectures...on operating systems.

  • The Ford-Fulkerson network flow algorithm, used to route traffic through nodes, was invented in 1956--not for sending oil along pipelines or routing civilian traffic, but to determine whether the Russian military could move enough troops over the existing roads to invade its neighbors (a quick max-flow sketch follows this list). The Cold War is responsible for so much useful science!

  • The limitations of reliable communication over an unreliable channel--the kind TCP has to cope with--were understood long before the protocol existed, thanks to the Two Generals problem: two separate armies coordinating an attack on a territory from two fronts. If only one attacks alone, devastating losses are incurred. If both strike simultaneously, the territory is easily conquered. The only problem? Agreeing on a time to strike, with an unreliable messenger running back and forth "routing" messages between the two on horseback or foot. What protocol would you use?
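Since the second bullet promised it: here is a minimal sketch of the max-flow idea in Python. The graph, node names, and capacities are made up for illustration, and this is the plain Ford-Fulkerson scheme with breadth-first augmenting paths from the textbook, not anything from the original 1956 work.

    from collections import deque

    def max_flow(capacity, source, sink):
        """Ford-Fulkerson with breadth-first augmenting paths.
        capacity maps (u, v) -> capacity of the directed edge u -> v."""
        residual = dict(capacity)
        for (u, v) in capacity:
            residual.setdefault((v, u), 0)   # reverse edges start at zero
        flow = 0
        while True:
            # BFS for a path from source to sink with leftover capacity
            parent = {source: None}
            queue = deque([source])
            while queue and sink not in parent:
                u = queue.popleft()
                for (tail, head), cap in residual.items():
                    if tail == u and cap > 0 and head not in parent:
                        parent[head] = u
                        queue.append(head)
            if sink not in parent:
                return flow                  # no augmenting path left: flow is maximal
            # find the bottleneck along the path, then push that much flow
            path, v = [], sink
            while parent[v] is not None:
                path.append((parent[v], v))
                v = parent[v]
            bottleneck = min(residual[e] for e in path)
            for (u, v) in path:
                residual[(u, v)] -= bottleneck
                residual[(v, u)] += bottleneck
            flow += bottleneck

    # Toy road network: how many divisions per day can move from 'east' to 'border'?
    roads = {('east', 'a'): 4, ('east', 'b'): 6, ('a', 'border'): 5,
             ('b', 'a'): 2, ('b', 'border'): 3}
    print(max_flow(roads, 'east', 'border'))  # -> 8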



It's nice to think that most of the early computer science leaps and bounds were born out of the hippie culture surrounding Xerox PARC in the 60's and 70's, but the truth is far more...crew cut.

Tuesday, March 3, 2009

Write-Only Memory

It is real.

Here it is, in all its Wikipedia glory:

Out of frustration with the long and seemingly useless chain of approvals required of component specifications, during which no actual checking seemed to occur, an engineer at Signetics once created a specification for a write-only memory and included it with a bunch of other specifications to be approved. This inclusion came to the attention of Signetics management only when regular customers started calling and asking for pricing information. Signetics published a corrected edition of the data book and requested the return of the 'erroneous' ones.


And here is the April Fool's prank pulled by Signetics:

Saturday, February 21, 2009

Dynamic Programming

Dynamic programming is one of the more interesting Computer Science techniques I've learned in my time as an undergrad. And I can't believe I almost received my diploma without really knowing what it means.

An Algorithms course isn't required to graduate, and when I first encountered the term I didn't realize that dynamic programming has nothing to do with writing code or dynamic languages. It's an optimization technique for taming problems that would otherwise blow up exponentially.

It applies to nearly every other subject I've held interest in, such as computer vision (with convolutions) or graphics (culling) or optimization in game theory. Never mind that Algorithms has crucial implications for other topics such as Networks, Databases, AI...every computer science topic is some goal-oriented application of specific algorithms. But before I digress...

Formally learning dynamic programming and its strategy of solving subproblems with memoization has only increased my interest in the technique:

Until now, this algorithms course has focused on greedy and divide-and-conquer algorithms. Although we began by defining an efficient algorithm as any that runs in polynomial time, up to this point the aim was to achieve O(n) or O(n log n) at worst, with O(n^2) solutions only rarely deemed acceptable.

With dynamic programming we are not nearly as optimistic. These problems have constraints or conditions that render the earlier techniques overly simplistic and ineffective. We creatively combine both greedy and divide-and-conquer ideas and use swaths of storage space in hopes of achieving polynomial runtime, meaning that O(n^3) or O(n^4) is not only acceptable--sometimes it's the best we can hope for.

Despite the pessimism, dynamic programming drastically reins in very difficult problems and occasionally leads to slick O(n) solutions for problems that would otherwise run in O(2^n) time. It's now one of the most sophisticated, versatile tools in my belt--I only wish I had known it before taking several other courses.
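To make the O(2^n)-to-O(n) claim concrete, here is the canonical toy example (mine, not from the course): naive recursive Fibonacci recomputes the same subproblems exponentially many times, while memoizing each answer the first time it's computed drops the whole thing to linear time.

    def fib_naive(n):
        """Plain recursion: the same subproblems are recomputed over and over, roughly O(2^n)."""
        if n < 2:
            return n
        return fib_naive(n - 1) + fib_naive(n - 2)

    def fib_memo(n, cache=None):
        """Dynamic programming via memoization: each subproblem is solved exactly once, O(n)."""
        if cache is None:
            cache = {}
        if n < 2:
            return n
        if n not in cache:
            cache[n] = fib_memo(n - 1, cache) + fib_memo(n - 2, cache)
        return cache[n]

    print(fib_memo(100))   # instantaneous; fib_naive(100) would outlast the Cold War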

So back to ranting about my disbelief that the algorithms course is not required to graduate from the department. I realize it's a difficult, time-consuming, proof-based course (everyone hates proofs, although I finally hate them a little less) that must compete with other "essential" courses for limited schedule space. But the thought that I nearly graduated without taking it is scary given how fundamental it is to every other course and topic in computer science.

True, I may never prove anything about the runtime of an algorithm I invent in The Real World. But any amount of formal reasoning is far more valuable than being "pretty sure" the solution works. Not to mention using solved problems as templates for real-world work, or simply making a good impression at job interviews. I'm glad I'm taking algorithms now, especially since I can't travel back in time and take it any sooner.

Friday, February 20, 2009

9 hours of messing with iPhone SDK

The iPhone feels like a new frontier, at least for non-early adopters such as myself.

After learning about Walk Score, I was in a bit of disbelief that their application was not available on the iPhone. Their lookup simply requires a latitude and longitude, which the iPhone readily provides. I dreamed of running around while looking for an apartment and pressing a button on my phone to get its walk score.

So, knowing nothing about Objective-C, mobile development, Xcode, or much else related to Apple development, I set out today to whip up a simple mashup of the iPhone + Walk Score and went at it for about 9 hours.

Unfortunately I haven't yet produced anything close to finished.

But I learned quite a bit.

For example, I learned that diving straight into the iPhone samples was a fruitless, frustrating process. Interface Builder was not intuitive, and I spent over an hour trying to add functionality to a button.

After downloading a 50-page tutorial on "Your First iPhone Application," I learned how rigorously the SDK enforces the Model-View-Controller principles, such that a simple Hello World app requires substantial scaffolding (50 pages worth!).

Additionally, I got a taste of Objective-C, a taste I have not yet acquired. It's more verbose than Java, yet dynamically typed, with awkward syntax and a fair number of dependency issues.

I learned that Objective-C prefers SAX-style XML parsing--the SDK's parser is event-driven rather than building a whole DOM tree in memory.

I figured out how to send HTTP queries to Walk Score, parsed their XML response, and displayed a score for a fixed latitude and longitude.
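For the curious, the gist of that step looks roughly like the following, translated into Python (the actual app uses Objective-C and the SDK's event-driven parser; the URL and XML element names below are placeholders I made up, not Walk Score's real API):

    # Rough Python (2.x-era) equivalent of the Objective-C code described above.
    # The endpoint and the XML layout are hypothetical stand-ins, not Walk Score's actual API.
    import urllib
    import xml.sax

    class ScoreHandler(xml.sax.ContentHandler):
        """SAX-style handler: the parser calls back element by element as it streams."""
        def __init__(self):
            xml.sax.ContentHandler.__init__(self)
            self.in_score = False
            self.score = None
        def startElement(self, name, attrs):
            self.in_score = (name == "walkscore")        # hypothetical element name
        def characters(self, content):
            if self.in_score and content.strip():
                self.score = int(content.strip())
        def endElement(self, name):
            if name == "walkscore":
                self.in_score = False

    def walk_score(lat, lon):
        url = "http://example.com/score?lat=%f&lon=%f" % (lat, lon)   # placeholder endpoint
        handler = ScoreHandler()
        xml.sax.parse(urllib.urlopen(url), handler)
        return handler.score

    print(walk_score(47.65, -122.30))   # a fixed latitude/longitude, roughly the UW campus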

After this, I wondered if a browser-based approach held more advantages. It would spare the XML parsing and simply direct users to the search results page for their coordinates. The idea held appeal, since I would be reusing existing browser functionality, and some scores take time to calculate and would require a redirect anyway.

As my networks professor mused a couple of days ago, "Perhaps one of the best programmers I've met is also one of the laziest programmers I've met; he never does more than he has to for his code to get the job done. Maybe his laziness is what makes him so good." (He was relating it to the design decisions behind NAT.)

After learning how to load web pages in an embedded browser, it came time to retrieve live GPS coordinates. Unfortunately, this is not as simple as a one-line API call. Using GPS is expensive battery-wise and is best done sparingly according to the dev docs. For this reason a custom location manager is required, which I was hooking up before deciding to call it a day and write this developer-diary entry.

Perhaps later this week I will wrap up the GPS retrieval and integrate it into the app's embedded-browser call. But I also learned it might not be worth it:

  • Walk Score's heat maps show that certain places, such as San Francisco, have nearly universal scores of 80-90 with only small pockets of lower quality. If the result is nearly uniform, is the experience of walking around with this app anything like I imagined?

  • Apple has a certification program in place. Testing and verifying the app could be a costly endeavor, and it might not even find its way into the App Store.

  • Now that I've learned some basics, I find myself less interested in this project than in what I could begin working on next.

All in all, a great experience. I'm still very excited to see what I can produce on this platform, even if I have to hold my nose while writing Objective-C...

Wednesday, January 28, 2009

Company Profiles

Ever since working for software companies which were spinoffs of other software companies which were spinoffs of other software companies, I've wondered what a tech company family tree would look like.

Although it runs slowly in the browsers I've tried, this map reveals an amazingly thorough time and space relationship graph for Puget Sound tech companies.

On top of this, LinkedIn has beefed up their company profile pages with illuminating statistics such as incoming and outgoing career paths. Plus, statistics on the median age of employees and a breakdown of which titles make up what percentage of the workforce shed light on facts that previously would have required some form of insider knowledge.

Now, for example, I know that prior to joining Microsoft, employees often worked for HP or Oracle, and after leaving they tend to land at Amazon or Yahoo. I also learned that the median age of a Microsoft employee is 34 and that 13% of its nearly 90,000 employees are software engineers.

It's juicy information like this that I covet for visualization purposes--perhaps simultaneously depicting the flow of workers in and out of a company along with its size and age. I'll have to see if I can snag myself an API partnership.

Tuesday, January 13, 2009

"Artifacts" of taking Graphics/Computer Vision courses

from http://flickr.com/photos/csb13/99025844/

  • When I am in a restaurant I sometimes notice the reflection and refraction of lights and objects through my water glass and try to trace the beams or other objects back to their origin. This is especially true in restaurants with colored lighting.


  • When I am walking around areas with lots of solid geometry, like the passageways between buildings on campus, I sometimes imagine that the world is translating or transforming instead of me, the viewer (since the two produce equivalent images, the usual approach in graphics is to shift the entire world rather than the camera--see the sketch after this list). I might also try to figure out where the vanishing points are relative to all the edges and corners.


  • I am aware of my internal edge sharpening and occasionally notice the effects of Mach banding, both phenomena that occur as a result of signal processing built into the wiring of the rods and cones of the eye itself.
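If the parenthetical in the second bullet sounds backwards, a tiny numerical check makes it concrete (my own toy numbers, nothing from either course): a graphics pipeline doesn't move the camera through the scene, it multiplies the whole world by the inverse of the camera's transform, which yields exactly the same view.

    import numpy as np

    def translation(t):
        m = np.eye(4)
        m[:3, 3] = t
        return m

    def rotation_y(theta):
        c, s = np.cos(theta), np.sin(theta)
        m = np.eye(4)
        m[0, 0], m[0, 2], m[2, 0], m[2, 2] = c, s, -s, c
        return m

    # The viewer's pose in the world: walk 3 units along x, then turn 30 degrees.
    camera_pose = translation([3.0, 0.0, 0.0]).dot(rotation_y(np.radians(30)))

    # Graphics pipelines don't move the camera; they transform the whole world
    # by the inverse of the camera's pose, which produces the identical image.
    view = np.linalg.inv(camera_pose)

    # A point 5 units straight ahead of the (moved) viewer, expressed in world coordinates...
    ahead_world = camera_pose.dot(np.array([0.0, 0.0, 5.0, 1.0]))
    # ...lands dead ahead on the z-axis once the world is shifted instead:
    print(view.dot(ahead_world))   # ~ [0. 0. 5. 1.]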

Monday, January 12, 2009

Diving into "Dive Into Python"

Well I'm now five chapters deep in Dive Into Python, which is available completely for free as a PDF.

It's an older book, but still relevant for its unusual and fantastic approach of teaching a new language to experienced programmers. The approach seems so obvious, and yet it's difficult to come up with other titles that do the same.


Each chapter begins with a short code example which highlights the power of Python while performing non-trivial tasks such as outputting the language's own documentation or displaying directory listings with file metadata.

These examples highlight the succinctness of the language while introducing a new language concept per chapter, effectively booting the reader into a frigid pond of fresh, semi-obtuse language constructs ('Get thrown into a freezing cold pond of Python' doesn't have quite the same ring, though).
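To give a flavor of what I mean (this is my own toy snippet, not one of the book's), here is the sort of introspective opener it favors: a few lines that print a module's own documentation using nothing but built-ins.

    import os

    def describe(module):
        """List a module's public callables with the first line of each docstring."""
        for name in dir(module):
            obj = getattr(module, name)
            if callable(obj) and not name.startswith("_"):
                doc = (obj.__doc__ or "no docstring").strip().split("\n")[0]
                print("%-20s %s" % (name, doc))

    describe(os.path)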

The remainder of the chapter is devoted simply to exposing the constructs and syntax of the example: both what it does and how it works, line by line if necessary. Code samples are available from the website, and the chapter encourages a hands-on approach of running and modifying the sample code.

At each chapter's conclusion, a checklist of concepts is presented, and the semi-obtuse sample from its start now reads as clearly as if it were written in a language you've known all along.

It makes perfect sense that the best way to learn a language is by dissecting code beyond one's current reading level and then changing and authoring examples to reinforce it all. It's a bit strange that more books don't go as far as this one, though it's possible the "by Example" series follows a similar tack.

Dive Into Python's chief drawback is its age--the book was written way back in 2003, the time of Python 2.3 (it's now the age of 2.6 and the dawn of 3.0). Still, this is a much more fun way to learn than reading, say, Python in a Nutshell, as I had been doing (it's more of a reference manual anyway, though more up to date).