All posts by todd

The Scourge of External Power Supplies

I accept that for portable devices that run off battery, having a completely separate power supply makes a lot of sense, especially for small devices like phones and MP3 players. Why, though, do appliances that will rarely, if ever, move from the place they are installed need to have their power supply separate from the appliance? I performed a quick audit at home and found 11 things that fit this category, most of which sit in a big tangle behind my desk.

This came to a head for me over the weekend. As part of my ongoing renovations I have put the centre of my network in a new cupboard under the stairs, which is the central point of the house. In this location I have moved my switch, cable modem, router/wireless access point and NAS box. Each of these devices has an external power supply. No matter how many options I tried, I have not been able to prevent this rather simple setup from becoming a clutter of cables. The problem is that each power cable has a big chunky box in the middle of it, which makes it impossible to neatly cable-tie them out of the way.

I understand that there is probably some efficiency for companies with multiple appliances in going through the AC power regulations and design once for an external supply, rather than once for each device. That does not help me though. Even their design, with the input and output cables on opposite ends, makes cable management for fixed devices hard.

If we cannot eliminate external power supplies can we at least do one of the following in order of preference:

– Have a common DC power standard so that a single power supply can serve multiple devices. With each appliance having a different voltage, connector size and core polarity, it’s hard to mix and match. If I could have one power supply connected to AC under my desk, with DC cables running from it to each device, I would also not have to worry about which adapter goes with which device when I move my desk again.

– Put the input and output lines on the same side. At least then the cables can be looped or run through a cable guide more easily. The adapters could be hooked to the wall with the cables neatly running down from them.

– Integrate the plug into the adapter. Even though this sometimes crowds the power board, I find it preferable to the ‘block in the middle’ approach.

I have a few items up for replacement in the near future; when I do, I am going to try to find items that don’t have external power supplies. I don’t know how much luck I will have.

Australia to get Fibre to the Premises

The Australian Government has announced that it has scrapped the tender process for a company to build an FTTN network, and will instead build its own new network that will put fibre out to 90% of all Australian homes. This is FTTP (Fibre to the Premises), which means that apartments will share access to the fibre rather than getting a run each. Home owners, though, will get 100Mbps fibre right to their front door.

This is great policy in terms of IT, economic stimulus and social equity. Surprisingly good from the same policy makers that brought us the great Australian firewall (more on this later). Rather than leaving it up to businesses to make economic decisions on where they can make the best profits from placing infrastructure, the government has decided to ensure everyone gets a chance by funding access for everyone. Suppliers of Internet services will then provide their services on top of this. This is identical to other utilities where the government supplies the base infrastructure to the homes and the service providers pay for access to provide their services over it.

Not only does it help to ensure everyone in the country has access to high-speed broadband, it also increases the ability for true competition to exist in the market. Currently all of the telco infrastructure is privately owned by the major carriers, who, despite some legislative controls, can block access to their competitors. This will allow smaller and local businesses to offer competitive options for Internet services and creative bundling of services like phone, broadband, television, etc.

What will still be lacking is the quality of service to the outside world, which is restricted by the distance we are from everything. Our links to the US are not large enough to get good throughput from there now, and upping the links to everyone’s homes will not improve this situation. It could, though, produce new options for local businesses hosting high-definition content within the country.

I am not holding my breath until the roll-out reaches my house, as it will be an 8-year project in total. We might finally get to the technological level promised for the ’90s.

Warner Bros to Buy Pirate Bay, today of all days.

So in early news out today from TorrentFreak, Warner Bros has decided to buy The Pirate Bay, as they have “built an exciting and powerful media platform that complements Warner Bros’s mission to organize the world’s information and make it universally accessible and useful.”

Is it just me or has the quality of April Fools jokes been declining in recent years? I hope we shall see some better attempts as the day progresses.

The “April Fools” comments that get placed on the actual articles’ comment streams also do a little to take away from the effectiveness of the ploy.

Will IBM buy Sun?

Big news around the traps is the potential for IBM to buy Sun for a projected $6.5B, about double what the company was valued at when the announcement came out. It has been rumoured for some time that Sun has had itself up for sale; its market share has been decreasing regardless of what they do to stop it, and they risk burning cash to keep themselves afloat. Their latest 10-K shows that in Quarter 2 revenue dropped while costs rose, despite a significant round of cost cutting and redundancies.

I don’t see any significant product advantage to IBM from this move. While they will gain some market share, history suggests they would be lucky to keep half of what Sun now has. The cost of transitioning Solaris customers to either AIX or Linux would be high; the only costlier option would be keeping both AIX and Solaris going in perpetuity. There is also the burden of continuing support for older versions of Solaris while the talented ex-Sun people stream away to other companies.

While IBM would gain control of Java, they already have pretty much open access to it without having to spend the development dollars. And in every other crossover market (databases, tape, storage and services) there would be similar prospects of difficult product-line merges.

If this does go ahead, I would think it is more of a blocking move than an improvement to IBM’s product line or market share. Sun is arguably at a bargain price at the moment. Before IBM’s interest became public, Sun’s market valuation was about equal to its equity position. Even at double that, this is a better deal than most tech takeovers. Sun has around $2.6B in cash and equivalents, which makes the real price around $3.9B, and splitting off and selling a few divisions would bring that down even further.
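The arithmetic here is simple enough to sketch. The bid and cash-and-equivalents figures are the ones quoted above; the division-sale line is a purely hypothetical number I made up to show the effect:

```python
# Back-of-the-envelope net cost of the rumoured deal (figures in $B).
# Bid and cash are the numbers quoted in this post; division_sales is
# an illustrative placeholder, not a real estimate.
bid = 6.5             # projected IBM offer for Sun
cash = 2.6            # Sun's cash and equivalents
division_sales = 0.5  # hypothetical proceeds from spinning off divisions

net_cost = bid - cash
print(f"Real price after netting out cash: ${net_cost:.1f}B")
print(f"After hypothetical division sales: ${net_cost - division_sales:.1f}B")
```

Netting an acquisition target’s cash off the headline price is the standard way to think about what the buyer actually pays.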

I would guess that IBM is worried about someone else buying Sun and getting quick access to a market they don’t compete in. Someone like Dell, Acer or Cisco, or possibly even Lenovo. Dell, for example, has $8B in cash and no Unix offering or credible high-end services. Bidding big and bidding early might let IBM snap Sun out of a potential rival’s hands.

The market seems to have decided this deal is going to happen. Sun shares rose to nearly $9, bringing their market cap up to almost the IBM bid price. This means the stock market is willing to bet $9 a share for a $1-a-share return.

This doesn’t seem good for Unix customers though. They will go from having three mid-range options to two. It will likely be good for Microsoft and the Linux vendors, though, as it will probably give a kicker to companies moving away from Unix.

The universe actually exists.

I am sure that most of the GNC audience are familiar with the names Schrödinger, Bohr, Heisenberg, Young and potentially Hardy. All of these people have been instrumental in quantum theory and, specifically for this article, in the uncertainty principle. This is the paradox that the act of observation changes the observed. The classic example is a photon of light, where it is not possible to measure both its position and its momentum accurately. In fact, the more accurately you determine the position, the less accurate your momentum measurement can be, and vice versa.
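That trade-off has a standard formal statement, Heisenberg’s uncertainty relation (my addition here, not something from the original post):

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
```

where \(\Delta x\) and \(\Delta p\) are the uncertainties in position and momentum and \(\hbar\) is the reduced Planck constant. Squeeze one side of the product down and the other is forced up, which is exactly the behaviour described above.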

This goes beyond the observer effect, where the physical interaction of the observer directly changes what is measured. The paradox is that simply observing is enough to change the reality. While demonstrable in small particles, the principle has been extended to larger objects like cats in boxes, trees in forests, and even to ponder whether the physical properties of the universe we take for granted actually exist without our observation.

The Economist has reported on experiments that two groups of scientists have independently undertaken to try to prove whether the universe actually exists when it isn’t observed. They have done this by performing experiments in which they don’t actually collect all the data, but run multiple trials so that the aggregated parts are enough to draw conclusions.

The article is a little technical in that it uses casual throw-away lines like “photons are their own antiparticles, and are pure energy in any case”, but if you’re feeling really smart today, here is a link to one of the actual papers. And, just because I like it, a link to an explanation of 10-dimensional space.

Is P2P the new music marketing arm?

When U2’s new album was “accidentally” released on an Australian website of Universal Music a week before its official release, I had some immediate suspicions. This is not the first time a U2 album has hit the download sites before its official release. In fact the previous album, ‘How to Dismantle an Atomic Bomb’, got out early after Bono played his stereo too loud and a passing fan recorded a few tracks.

The outcome of the previous incident was U2 getting the biggest first-week sales they had ever had, with 840,000 copies sold in the US alone and triple platinum within a month, a feat their previous album had taken over a year to achieve. One thing that has been clear to a lot of people paying attention to what was really happening is that, for some types of music, P2P really helps sales.

In the case of U2, the press that accompanied the leaking of the four tracks helped create a buzz that definitely contributed to booming sales. I would not have been surprised to learn that the pre-release of the new album was part of a marketing campaign by Universal. Then my suspicions were really aroused when Sony decided to get in on the trick, with iTunes Norway posting a Kelly Clarkson album a couple of weeks before release. This puts it on the verge of moving from a coincidence to a trend.

I expect to see more of this. It may become the modern equivalent of when bands used to slot a new song into their live set to excite the fans, or the bootleg albums that used to make the rounds. While the press these issues engender will fade as it becomes more common, this tactic will go right to the fan base and rev them up for a great opening for a new release.