All posts by Matthew Greensmith

Surviving redundancy: It’s all in the planning



If you are unfortunate enough to lose your job, your goal should be to give yourself at least 6 months to find a new one.  You do not want to be struggling to pay the bills before you get another job.  The main point of this is to allow yourself time to find a job that suits you.  The perfect job might not exist, but you do not want to be forced to take a position that might not look good on your CV because the debt collectors are closing in.

Once you lose your job, all you can really do is tighten your belt and make sure you talk to your creditors before they call you.  There are, however, a few things you can do to prepare yourself in case a job loss is in your future.  You may already know some of them, and none of them are new ideas, but I hope collecting them into a short list helps.

You need a buffer:

Don’t spend your money as fast as it comes in the door, and definitely not faster.  For many people, keeping a buffer of money in their bank account is almost impossible, so set up an automatic payment that puts a few hundred into an investment account after each pay period.  Make sure the account is a little hard to get to: no linked plastic and 24-48 hours to get money out.  You will be surprised how quickly this builds up, and making it a little hard to get to stops the impulse buying.

Avoid credit (especially cards):

Credit cards can make life easier, but maintaining debt on them is expensive and the second worst form of debt there is.  If you can’t pay them off in full each month when the bill comes, tear them up so you are not tempted.

The worst form of debt is payment plans, as they are not only a millstone around your neck but also indicate that you paid too much in the first place.  There is no such thing as “interest free”; no interest in the payments means it is buried in the purchase price.  This is part of the reason why you can get a discount on most large items if you pay cash.  As a general rule, if you can’t save up for it beforehand, think about whether you really need it.

Credit lines like these are generally manageable as long as they are kept under control, but if you suddenly find yourself unemployed for a while, they are spending you cannot cut back on.  While you can stop saving for a new TV until you get a new job, you cannot stop the payments for the one you already have.

If you are buying something today that will be worth less in the future, credit is almost always a bad way to buy it.

Keep your resume up to date:

Obviously you want to get your resume out quickly if you lose your job.  When you are updating it at the last minute, though, it can be hard to capture everything that could help you win the job.  When you complete a successful product, get a promotion, receive an award, or complete a course, stick it into your resume straight away.  I keep a master resume which holds all the key facts I might want to mention, then cut that down to only the ones that are applicable to the job I am applying for.

Practise interviewing:

Handling yourself well in an interview takes practice, so do it occasionally.  Get together with a friend or relative and have them role-play an interview with you.  A search for sample interview questions on Google nets about 66 million results, so it is not hard to find a few to practice with.

Maintain your network:

Know which recruitment companies service your area and potentially even get yourself on their books.  Keep in touch with ex-colleagues that you liked.  These are the avenues for finding the jobs that don’t get advertised, which are usually the best ones.

When it happens:

If you find yourself out of work suddenly, you will have the twin benefits of being able to ride out the 6+ months it may take for the right job to come along, and of being ready to start discovering the jobs that are available and applying straight away.


How heavy is a useable Petabyte?



The team at MatrixStore have a post up calculating the weight of a petabyte of storage today compared to 1980.  Needless to say, today’s weight is a lot less.  The article was inspired by a post on Gizmodo illustrating how big a PB is.  There are two problems with the calculations, though.

  1. 2TB is a marketing number.  The formatted capacity of 500 2TB drives is more like 916TB
  2. The weight is for the drives alone, which is not storage you could actually access and use
Image courtesy of WD

If we want to use 2TB drives we need a system that can hold 3.5″ drives.  The highest capacity tray I know of for these drives takes 48 SAS or SATA drives in 4U of rack space (about 7″ x 19″ x 24″).  You can sit several of these trays behind a single RAID box, which provides access for your computers over IP or FC depending on the type.

To get a real petabyte of base-2 storage you would need 546 2TB drives.  The whole setup, including racks and power, would weigh 1400kg or 3100lbs.  It would also consume 12kVA of power while putting out 39kBTU/hr of heat.

546 drives gives a PB of raw storage, though.  In reality you would need to protect it from drive failures using RAID.  If we go for as little overhead as possible we can create 24-disk RAID 6 sets, which would have about 40.4TB of usable storage each.  For this we need about 600 drives, which adds another 100kg to the weight.  Still about 1/180th of the weight of a PB just 30 years ago.
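For anyone who wants to play with the assumptions, here is a rough back-of-the-envelope sketch in Python.  The formatted capacity is derived from the 916TB-per-500-drives figure above, while the per-drive weight is a placeholder of my own, so adjust the numbers to taste.

```python
import math

# Rough back-of-the-envelope sketch, not a sizing tool.  The formatted
# capacity is derived from the 916TB-per-500-drives figure above; the
# per-drive weight is an assumed placeholder.
FORMATTED_TIB_PER_DRIVE = 916 / 500   # ~1.83 usable per marketing "2TB" drive
RAID_SET_SIZE = 24                    # disks per RAID 6 set
RAID_PARITY = 2                       # RAID 6 gives up two disks per set to parity
DRIVE_WEIGHT_KG = 0.7                 # assumed weight of a single 3.5" drive

TARGET_TIB = 1024                     # one base-2 petabyte (1 PiB)

usable_tib_per_set = (RAID_SET_SIZE - RAID_PARITY) * FORMATTED_TIB_PER_DRIVE
sets_needed = math.ceil(TARGET_TIB / usable_tib_per_set)
drives_needed = sets_needed * RAID_SET_SIZE

print(f"Usable space per {RAID_SET_SIZE}-disk RAID 6 set: {usable_tib_per_set:.1f} TiB")
print(f"RAID 6 sets needed: {sets_needed}, drives needed: {drives_needed}")
print(f"Weight of the drives alone: {drives_needed * DRIVE_WEIGHT_KG:.0f} kg")
```

With those inputs the drive count lands around the 600 mark mentioned above; the racks, power and cooling gear are what push the total weight up towards the 1400kg figure.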


Mac = Meh



I decided to jump on the bandwagon in May and purchased an iMac as a replacement for one of my PCs.  While I was not expecting an epiphany, I must say I am a little underwhelmed by the experience.  Apart from the small differences in mouse use and where to look for menus, I have really not found it any different from a Windows PC, and it feels quite similar to my KDE desktop.  I guess I expected to be more impressed.

Image courtesy of Apple

So far the Mac has been admirably justifying the main reason I purchased it.  The PC it replaced was the family one, and even with anti-virus software I was still finding myself cleaning out some sort of trojan or malware every few months.  After 3 months I have had to do zero maintenance on the Mac, so in that regard I am pleased.

Most of the actual issues I have with the platform probably relate more to my experience level than any inherent problem; for instance, OpenOffice is still not running as well as I would like.  I am also not that impressed with Safari, even though v4 is a definite improvement (I see a Firefox install in the near future).

What I do really like is the quality of the design.  The only cable involved is power and it looks good enough that I am considering moving it out into the family area.  The one bundled app I have fallen in love with though is iPhoto, especially the face recognition feature.  I haven’t found the photo editing any better though.

While I am not disappointed in the Mac, it has not impressed me enough to change my main system from a Windows/Linux machine.  If any of the experienced Mac hands out there have any suggestions on how to improve the taste of the kool-aid, please send them in.  As long as the box stays trojan free it will remain appreciated.


Why Blu-Ray is still doomed!



I know there are some good things about Blu-Ray, and I have been impressed by the quality of Blu-Ray movies I have seen on a friend’s PS3.  Those of us skilled at pattern recognition will continue to avoid this doomed platform, though.

With the recent announcement that Sony has dropped the UMD standard from the new release of the PlayStation Portable (the PSP Go), we get to see yet another example of my oft-repeated advice.

Never invest in a Sony controlled data storage medium!

Sony have tried to play in various storage markets before with completely Sony-owned technology, and I cannot think of a single one where they eventually triumphed, even when they started out technically superior.  To be fair to Sony, I do not think they have been specifically bad at maintaining their technology; it is simply harder for proprietary technology to keep up with open standards.  This is exacerbated when you are working in an OEM environment where your customers are highly motivated to break your monopoly.

Sony used to be able to artificially extend their technologies by having really good equipment and bundling the technology in.  Now that Sony no longer has a quality edge on most of their competition, that is harder to do.

Beta tape was much better than VHS but eventually was overtaken and disappeared.

DAT (Digital Audio Tape) was an alternative to CDs which hung around for a long time in professional music circles but never took off in the consumer market.

AIT was a successor to DAT designed for the low end data backup market.  Despite being late to market it was making inroads on the similarly closed source DLT.  Then DLT was open sourced and wiped AIT out.

MiniDisc never really made it outside of the Sony umbrella, and very little music was actually released on the format.  Once the other MP3 players moved from CD to hard drive or solid state, MiniDisc died a quick death.

Memory Stick only survives by being the only option on many Sony products.  No other manufacturer uses it, and it is behind in capacity and more expensive.

Then there were the dark plastic “CDs” that PS1 games used to come on, which even the PS2 struggled to read and which reached the end of their life before the platform they were designed for.

Now UMD joins the pile of Sony data platforms made defunct much more quickly than any comparable open standard.  If you have bought content on a specific medium, I think it is reasonable to expect that you will be able to buy a new player for that content for at least the next decade, and that the cost of those players will go down over time.  This has generally been possible with any other standard in the past, but almost never with a Sony platform.


The Plan to Fix the Internet



Fellow Ohana, although it might be a highly optimistic claim, I believe I might have the beginnings of an idea to save the Internet from itself.  The audacious goals of this plan are to:

  1. Ensure users get the value they pay for
  2. Reward contributors for the value they provide
  3. Give incentive for quality without blocking freedom to post anything
  4. Provide incentives to make the Internet faster and more efficient

In this post I want to lay out the basics of this plan to see what you think and gauge the reaction.  If we think it will work we can take the Ohana Internet Plan (OIP) to the world, and think up a title with a cooler acronym.  Bear in mind that this is a rough outline of an early stage idea, so feel free to rip it to shreds.  Just let me know what you like about it as well.

I have been thinking about two seemingly separate problems with the Internet recently, which I have talked about in previous posts here on GNC.

  1. Metered Internet is essentially looking inevitable, so if we have little chance of stopping it how can we turn it more to the advantage of users than the ISPs currently plan?
  2. How can contributors of quality content claim their fair share of the value?  Without them the Internet ceases to be, yet they struggle to attract direct revenue, and advertising is a flawed revenue model.

It has occurred to me that these two problems could have the same solution.  Metered Internet that credits you for uploads.

The credit rate would of course have to be lower than the download rate otherwise there is no profit for anyone.  The difference in rate would be largest at the edge and smallest at the backbone.  The broad results would be:

  • It’s always profitable to forward traffic.  Net Neutrality solved.
  • The more traffic an ISP forwards the better for them.  Incentive to build fatter pipes.
  • The more traffic the ISP can keep off the backbone the better for them.  Drive for better efficiency.
  • Popular content will earn revenue for its creator.  Rewards for producing good content and less crappy advertising.
  • The return is better the closer your content is to the backbone.  An incentive for aggregators like YouTube to offer to host content for its producers.
  • Higher definition content is worth more to producer and consumer.  If rich content is valuable to your viewers it is worthwhile to produce it in a variety of qualities and let them self select.

Some example numbers, using nice round ones for ease, so let’s not get too hung up on the values.  Let’s say that the base rate for a 10GB chunk of data is set to $1.00; this is the rate at the backbone to transfer data from one place to another.  In order to get the money to maintain and improve itself, the backbone charges $1.01 to the destination of a chunk and pays $0.99 to the originator.  For every chunk it transfers it makes $0.02.

Next is the wholesale ISP that connects to the backbone.  It has wholesale supply deals with local ISPs (we are using a simple model here).  It charges $1.30 per chunk and pays out $0.70.  Whichever way the data flows to the backbone the wholesaler makes $0.29.  If it can get information from the source to the destination without touching the backbone it gets more than double, $0.60.

The local ISP provides the service to your home.  It charges $1.60 per chunk downloaded and pays $0.40 per chunk uploaded.  This company then makes $0.30 for each chunk regardless of which way it goes.
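To make the margins easy to check, here is a minimal sketch in Python of the three tiers in this example.  The rates are just the illustrative round numbers above, and the tier names and function names are my own labels, not part of any real proposal.

```python
# Minimal sketch of the tiered "credit for uploads" pricing example above.
# All rates are the illustrative round numbers from the post, per 10GB chunk.
TIERS = {
    "backbone":  {"charge": 1.01, "pay": 0.99},
    "wholesale": {"charge": 1.30, "pay": 0.70},
    "local_isp": {"charge": 1.60, "pay": 0.40},
}

def margin_via_upstream(tier, upstream):
    """Margin per chunk when the data has to pass through the network above."""
    down = tier["charge"] - upstream["charge"]   # we charge our customer, pay upstream
    up = upstream["pay"] - tier["pay"]           # upstream pays us, we pay our customer
    return down, up

def margin_local_only(tier):
    """Margin per chunk when both endpoints sit under the same network."""
    return tier["charge"] - tier["pay"]

down, up = margin_via_upstream(TIERS["wholesale"], TIERS["backbone"])
print(f"Wholesaler via backbone: ${down:.2f} down, ${up:.2f} up")                        # $0.29 each way
print(f"Wholesaler keeping traffic local: ${margin_local_only(TIERS['wholesale']):.2f}")  # $0.60

down, up = margin_via_upstream(TIERS["local_isp"], TIERS["wholesale"])
print(f"Local ISP: ${down:.2f} per download, ${up:.2f} per upload")                       # $0.30 each way
```

Whichever direction the traffic flows, each tier earns the same margin, and keeping traffic off the backbone roughly doubles the wholesaler’s take, which is exactly the incentive structure the bullet points describe.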

A key point to all this is that everything you upload, someone else must agree to download.  This not only rewards people directly for a valuable contribution, but also puts a limit on rubbish.  Most sites are going to charge to accept your content, with a share in the payments back to you if it gets downloaded.  And that share will have to be substantial to attract you to post your content with them rather than host it yourself.

In regard to general Internet surfing, this plan would have little effect.  For text-based content the incremental cost would be small enough to be insignificant unless you attracted a very large readership.  And I truly believe that a metered Internet would end up costing us a lot less than ‘unlimited’ plans cost.

The problems I can see already:

  • This cannot happen without Government backing
  • Spam: would you want to pay to receive it, while the originator could get a return for sending it?
  • Calculating what the actual rates should be is difficult
  • It will require a metering method that people can trust as accurate

So let me know what you think.  Remember that perfect is not the goal; we are striving for better and fairer than what we have now.

Image sourced from Matt Britt under Creative Commons 2.5


How many times has Todd’s water been wee?



In #479 Todd mentioned how much happier he was to be drinking tap water than the re-cycled urine the ISS occupants were looking forward to.  Now, I was taught a long time ago about the water cycle of ocean, to rain, to river, to ocean.  During this process animals drink it, or eat it in their food, then dispose of it in urine.  It got me thinking on a strange tangent about the chances that Todd was actually drinking re-cycled urine without knowing it.

So armed with Google and some very liberal over-simplification, I have made a quick back-of-the-envelope calculation.  There is no point in making any claim of accuracy for the amount of urine produced per day over all of time.  Taking today’s population of humans, cows, pigs and sheep we get roughly 87 billion liters per day, which will be substantially less than the actual total.  It needs to be, because I am going to assume that this same volume has been produced every day stretching back to when large animals are first recorded as being present, 230 million years ago.

There’s a lot of water in the world, approximately 1.4 billion cubic kilometers.  At 87 billion liters a day it would take 16 billion days to convert it all to urine.  Since the recorded beginning of large animals, though, there have been 84 billion days.  That would mean an average of about 5 re-cycles for any given amount of water.
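For anyone who wants to check or tweak these figures, here is the same back-of-the-envelope calculation in Python.  The daily urine volume and the 230-million-year window are the deliberately rough assumptions above, and the water volume is the commonly quoted estimate of roughly 1.4 billion cubic kilometers.

```python
# Back-of-the-envelope version of the calculation above; all inputs are the
# deliberately rough assumptions from the post.
WATER_KM3 = 1.4e9                 # total water on Earth, cubic kilometers
LITERS_PER_KM3 = 1e12             # 1 km^3 = 10^12 liters
URINE_LITERS_PER_DAY = 87e9       # humans + cows + pigs + sheep, today
YEARS_OF_LARGE_ANIMALS = 230e6    # large animals first recorded ~230 million years ago

total_liters = WATER_KM3 * LITERS_PER_KM3
days_to_cycle_once = total_liters / URINE_LITERS_PER_DAY
days_available = YEARS_OF_LARGE_ANIMALS * 365

print(f"Days to pass all water through once: {days_to_cycle_once:.2e}")   # ~1.6e10 (16 billion)
print(f"Days since large animals appeared:   {days_available:.2e}")       # ~8.4e10 (84 billion)
print(f"Average number of re-cycles:         {days_available / days_to_cycle_once:.1f}")
```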

In reality there are lots of complications with these sums, even outside the extremely inaccurate (but lowball) daily volume.  A lot of the water we drink leaves in sweat and our breath, and a lot of the water in urine comes from breaking down sugars and fats.  The water gets into these through plants and then the animals in the food chain.  I still think the numbers are good enough to make a solid claim that at least 10% of any volume of water has previously passed through a urinary tract.  The ISS is just increasing the percentage.


How can RSS be fixed?



I am sure a lot of you have seen Steve Gillmor’s article on TechCrunch talking about how he has stopped using RSS because of the small amount of quality information he gets from it compared to the large amount of dross he must wade through to get it.  In his case he is looking to replace it with Twitter.  I personally think he will find the situation the same there over time.

Firstly, the problem is not with RSS itself.  RSS is only a notification method for content that is on websites.  In a sense what he is getting from Twitter is a similar thing, with the added benefit of the content being edited by a person before it gets sent.  I can completely see how this would be better for the way Steve says he looks for content.  According to his article, he added people likely to write interesting things to his RSS reader.  This is an attempt at a personal(ish) connection that RSS just cannot provide accurately.

RSS will give you everything posted to the site that meets the rules set for the feed.  For example, if you took the GNC feed to get more Todd, you also get me and all the other authors (lucky you!).  If all you really wanted was to hear more about what Todd was thinking, you would be getting a lot you didn’t want.

A title plus part or all of the article is also not a great way to get a synopsis.  The brief summary may not give you a correct sense of the content.  Also, the content posted to websites and blogs is often a crafted piece to some degree, rather than a brief summary and pointer to interesting information.  This limits how much of the personal you can expect.

Twitter, on the other hand, enforces brevity and clarity with its character limit.  It is also simple and quick enough to encourage a higher rate of posting.  There is also a level of self-editing of the content that goes to Twitter.  While some people tweet everything they do, others will only tweet their best stuff, or post links to relevant commentary in an ongoing discussion.

In short, if your goal is to be more connected to the information produced by particular people, I can see how Twitter could work better than RSS currently does.  For myself, I use RSS to get updated when specific sites I like have new content, and as I am looking for specific topics I have many of the broad-based sites limited through keyword filters.  RSS still works better for me.
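For what it’s worth, the keyword filtering I mean is nothing fancy.  Here is a minimal sketch of the idea in Python, assuming the feedparser library; the feed URL and keyword list are placeholders, not a real configuration.

```python
# Minimal sketch of keyword-filtering an RSS feed; URL and keywords are placeholders.
import feedparser

FEED_URL = "https://example.com/feed.xml"
KEYWORDS = {"storage", "rss", "twitter"}   # topics I actually want to read about

feed = feedparser.parse(FEED_URL)
for entry in feed.entries:
    text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
    if any(keyword in text for keyword in KEYWORDS):
        print(entry.get("title"), "->", entry.get("link"))
```

A dedicated reader does this with a nicer interface, but the principle is the same: let the broad feeds in, then only surface the items that match the topics you care about.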

My concern is that the level of quality Steve is currently getting from Twitter may fade as more and more people get onto the system and as Twitter evolves into the yet-to-be-revealed money-making version of itself.  RSS as an automated notification and information aggregation tool has a lot of power and acceptance; is there something we can do to make it work better for us?