Will Microsoft Antagonizing Customers Work? [Stupid Microsoft Tricks]

Making a sidebar gadget to show the number of days of support remaining for Windows XP is simply a bad idea. Not only is it a waste of desktop space, but only the fanboys will use something like this.

It looks as though Mr. Ballmer has really lost it this time, and the repercussions will be felt all over. Those who support Windows XP independently of Microsoft will begin to grab the lion’s share of that business, and Microsoft will have a problem selling to the many who will be more than miffed by this puerile behavior.

What is really stupid is that the item is only for those already running Vista or Windows 7. So how does it function except to annoy those who know about it, or see it on someone else’s machine? Couldn’t the time wasted creating this, or deciding it should be released, have been better used elsewhere?

Jokes like this are how some companies lose lots of revenue, as paying customers tend to believe that any attempts at support will be half-hearted, or less. Apparently no one at Microsoft knows the adage about catching more flies with honey than vinegar…

I think about it this way – in what other realm would a provider benefit from the ridicule of those using one of its older products? If you are driving a Ford truck of 2001 vintage, and Ford gives away moving signs to owners of 2009–2011 model trucks so they can razz the drivers of the older ones, how happy will those drivers be? How many drivers of the newer trucks would take advantage of it? How many would be afraid of the damage done to their trucks by those driving the older products?

I know in California, it doesn’t take much to get your car keyed, and trying to prove someone did it is very hard.

If you’re among those who find this somehow useful, it can be downloaded (after you establish that your copy of Windows Vista or Windows 7 is genuine) from here.



Who Says Seagate’s Purchase Of Samsung HDDs Is A Good Thing?

Of course, that is how it is being sold. But when it comes down to the real details, will it be a good thing for Seagate and Western Digital to own 90% of the hard drive market?

When there is less competition (and make no mistake, there will be less), the loss of Samsung’s production of quality drives, which kept the price of hard drives down to the benefit of everyone who buys HDDs, will certainly be felt.

The perfect analog is the graphics business. While you may say that there is vigorous competition there, a second look shows that the competition is only vigorous when the two competitors agree there should be. Certain pieces roll on at their own pace, never falling victim to the price drops which would normally occur if there were a third choice. The example I can think of immediately is the Radeon 4740 cards. Though they were surpassed in performance relatively early in their lifespan, they never fell victim to the continued price drops seen on other cards. The pricing stayed above $130 everywhere, as nVidia had produced nothing to introduce competition, and AMD kept the numbers small all through production.

What happens when the two drive competitors decide that there is only going to be competition on certain models? The choice then, is to pay extra for what you want, or go with the price performers that they decide are good for you.

With one less drive manufacturer, there will be one less R&D unit working, so whatever gains can be had through greater mass of production (which, at the volume of any of the drive makers, would be extremely small) will be more than offset by the lack of drive to work more performance into the product.

The article which gave the news of the purchase also gave a quote by a person from an independent group –

“I think there will still be plenty of competition between the three remaining players. Ninety percent of the market will be split pretty much evenly between Western Digital and Seagate, with Toshiba playing in some niche opportunities,” said John Webster, a senior partner at market research firm Evaluator Group.

What else might they say? No one is going to really stand up and state that most of the competition is gone, as only a small flurry would be needed to wipe that notion from the mind of the greater public, before getting back to agreed upon business practices and pricing.

There will be some changes coming, for sure. The price drops of drives, as they are superseded by larger ones, will proceed in a much more orderly fashion, as there won’t be any third party upstart, making the big two play nicely. Toshiba, as spoiler, won’t work. They don’t have enough of the market to force anything – not even in the 2.5” drive market.

Some may say that this is a reaction to the Western Digital purchase of Hitachi, but it is different: Seagate was larger than Western Digital until that purchase, and Western Digital was still not hugely affected by the addition, as the Hitachi numbers had dwindled since the company was known as an IBM business, whereas Samsung was becoming a very big player in both moving-disk and solid state drives.

The sad part is that nothing can be done about this, as it is so very large (and multinational) that no oversight group can complain with any perceived effect.

Life is far from over for the hard drive buyer, but it is certainly going to change significantly.


The Power Of Deduction

Yesterday, I was reading about the upcoming move of Best Buy to reduce the size of its stores and move into the mobile marketplace. I’m not exactly certain what is meant by the mobile marketplace, as the only mobile marketplaces I have ever been familiar with are those who work the various swap meets –  and that certainly doesn’t sound like the domain of Best Buy.

The stated reason for the change is that the company realizes how much commerce is occurring on the internet, which is their obvious way of saying that they are losing money.

That is certainly easy enough to deduce, as from a customer’s perspective, there is little point in ever purchasing anything from Best Buy. As someone who has been purchasing things longer than Best Buy has been around, and has seen a few chain stores (the term big-box is relatively new also) come and go, I feel that I am in a position to make some conclusions about the problems the stores are experiencing.

As I also have many years of retail experience, I can certainly say that, from a customer service standpoint, I have never been given the kind of treatment in a Best Buy that would make me return to buy if there were another choice. The employees, to a person, are all less interested in serving customers than in whatever else might be happening on the planet. I have visited a Best Buy where I saw 6 employees behind a counter, all talking about a television program from the previous evening, and though I agreed with their general assessment of the show, it did nothing to help me in my search for a graphics card which was on sale – a sale which would have put many dollars into the store’s register, as I would have purchased 5 of them. Their avoidance of eye contact, and general refusal to venture out from behind the counter, made them useless to me, and to the store.

Instead, I made my customer a better deal and kept him waiting an extra day while I upgraded his machines, choosing instead to get the video cards from NewEgg. The reduced price to my customer did not reduce my profit one iota, as the everyday price of NewEgg was that much lower on the same products. That was 6 years ago, but today, with the cost of gas, and the aggravation of the experience, well, I think that most anyone knowing those facts would eschew Best Buy for the greener pastures of mail order, and customer service.

I have been in Best Buys since that time, many times because someone else I happened to be with wished to go there. Each time I was extremely disappointed at what was passing for service (most of the time there was none), and I wondered why it took so many employees to keep people from stealing things, as that could be the only reason for their being posted there.

The point I touched on about price is the one which is usually the most important to the majority of people, but those that have done as much work on computers as I have, know that there is solid value in being able to quickly repair a problem, and so immediate replacement from a brick-and-mortar store means that we’re usually willing to spend a bit more, especially if we don’t have complete confidence in a product because it is new, or proven to be of variable reliability. In case some might wonder, there are times when as a systems integrator, you allow the customer to obtain something that you would never use, but for the reasons only they know, they are insistent upon using it. When the product fails inside 90 days, it is very nice to be able to quickly grab another and get them up and running in short order. (While you replace the item, you gently remind them that you warned against the usage, and tell them that not only might they take your advice next time around, but that if the item you are installing goes bad again, after 90 more days, it might just be awhile before the replacement comes. You do this gently, always, and never let the “I told you so” tone come into the conversation.)

But back to the power of deduction – if the people at Best Buy were better at deducing that many people are tired of being ignored, tired of hearing how the store has the best prices (which, even compared to other brick-and-mortars, it does not), and tired of hearing how they could be best served by letting the Geek Squad do a spring cleaning on their wallets while remedying their problems, they might: get a crew that was helpful, and at the very least attentive; lower their prices below retail on everything, and on things easily shopped by the average person, lower them a bit more; and realize that the prices of Geek Squad, and the spiel about it, are not well tolerated by most customers. Those who wish to use Geek Squad will certainly come, but they will ask, or be responsive to gentle questioning about it, instead of the velvet hammer routine usually employed. Management might then come to the idea that those are the reasons why profit margins are shrinking, more than anything the economy is causing, because people are hard to separate from their electronics, and will sacrifice a lot to be able to afford the latest incarnation of whatever they really want. (By the way, the velvet hammer is usually wielded by the cashiers, who are the only ones in the store who pay any attention to the bodies milling about; the pushing of Geek Squad assistance must be a condition of their employment, as they all seem to push as though their jobs depended upon it.)

As someone who has outlived many chains, I can say that most of them forgot that, while the customer is not always right, they frequently are. They are also always the ones paying the salaries of those working, so right or not, they had better be taken seriously.

With a small bit of attention to customer service, and a bit of price pruning, even someone such as I, having been too many times disgusted with my visits to those stores, might give them another chance.


Does Microsoft Have Its Own Tick-Tock Strategy?

The followers of technology are fully aware that computer chip giant Intel has a strategy for its processors which it describes as Tick-Tock. In short, it is a two-year cadence: in the Tick, the process size is taken down a notch, yielding faster processors which use less energy; in the Tock, the architecture is changed, allowing for improvements in basic design. The idea is a brilliant one, as the design team can work on the architecture for a much longer period, and the team involved with shrinking the process has that much more time, yet there is constant, year-to-year innovation.

It’s like the very best Gantt chart you could come up with for microelectronics innovation.

Lately (and by that I mean yesterday and today), I have been doing much reading about the changes found in the leaks of the alpha code of Windows 8, including the rumors as well as the verified stuff, and thinking that perhaps Microsoft has actually learned something about itself, and in doing so, come across its own tick-tock strategy of sorts.

With Windows 7, not much was really changed under the skin. The basics are not identical to Vista, but they are much closer to Vista than to anything previous, and the largest changes were to the UI, producing something that most everyone who approached it during the extended beta period seemed to like. That made the changes flow, and the timetable more fluid and less worrisome for the clock watchers.

In the things which are leaking from the various sites, and the rumor mills where the early betas have been leaked, it is clear that less is changing with the UI and more with the underpinnings. Things already spoken of, like hybrid boot, are clearly aimed at the improvement of performance, and have nothing to do with looks.

The new task manager, from early appearances, is designed to be easier to use, but also to turn the average user who can read with understanding into more of a power user, with finer control over the actions of the machine, without a great deal of study, or interest in Microsoft-speak. This can only be a good thing, and will also affect performance; and though the look is different, it is for effective use, and not style, that the look is changing.

Unfortunately, not all of the new underpinnings appear to be of the helpful variety. There appears to be a new part of the Windows system devoted to the installation being Genuine, and you just know that is not a good thing. Lest anyone get the wrong idea, I say that as a victim of Windows Genuine Advantage and other system checks going completely wrong, when in fact my system was perfectly valid and authentic.

My worst problem was when I was working on my dual-core Phenom II machine that I was trying to turn into a triple or quad core machine. This was during the first week of operation, and also during the week of October 22, 2009. I had installed my Windows 7 Ultimate Steve Ballmer Signature 64-bit edition of the Microsoft operating system, and when the effort to unlock the third and fourth cores went awry, the system began telling me that my installation was not genuine, putting up a lot of fuss while I worked to return things to normal. I was fully expecting a visit from Mr. Ballmer and the thought police. If I, someone with lots of experience with WGA and Windows in general, was having problems, imagine what Joe Average would have experienced.

With the move to an even more apparent effort at keeping the system under the Microsoft thumb, it remains to be seen how the Genuine Center may foul the otherwise smooth waters of Windows 8.

In all, it really does look as though Microsoft, unable to capably cope with the schedule that was once deemed appropriate – a major release every 3 years – has opted to change the guts in one update cycle and the cosmetics in the next, allowing both the core and GUI teams to work at a more comfortable pace, while still allowing Microsoft the every-3-year influx of dollars it wishes to enjoy.


Comcast 105 Megabit Service – Is It Worth It?

The news has come that Comcast wants to be the big boy on the block, with offerings of the fastest service for general use, at 105 megabits per second, called Xfinity (no one said they were good at naming conventions). That is very fast indeed, and makes things very quick for those that have the service.

The service is still greatly asymmetric, with upload speeds capped at 10 Mb/s, so anyone wishing to run a busy server will still want another offering. No doubt running a server is specifically disallowed, its barring most certainly found in the small print of the contract one must sign to obtain the service. It always seems to be that way – running a server always incurs business rates, which is just another way to gouge the customer, who might actually use less bandwidth than someone doing heavy downloading as part of a busy household.

The service will only be available in large cities for now (listed cities are all of populations over 200K), with other cities added eventually. So, if you were counting on this to be your salvation from dial-up in West Fencepost, Wyoming, you’re going to have to wait a bit longer.

The service, at 105 Mb/s, will allow for many things to be done, but the price is fairly dear: $105 per month for the first 12 months, after which it zooms to $200 per month. The insult added to that injury is that the caps which might be expected to evaporate at such an expense still remain. 250 GB per month can go quickly at 105 Mb/s! For those unable to imagine this, the entire allowance could be consumed in about 5 hours – though that would take a coordinated usage of the pipe. Still, that is not unreasonable in many households these days.
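For those who want to check that five-hour figure, a quick back-of-the-envelope script (using the article’s numbers, assuming decimal gigabytes and a fully saturated downstream) confirms the estimate:

```python
# Figures from the article: 250 GB monthly cap, 105 Mb/s downstream.
# Assumes decimal units (1 GB = 1000 megabytes) and a fully saturated line.
CAP_GB = 250
RATE_MBPS = 105

cap_megabits = CAP_GB * 1000 * 8        # 250 GB -> 2,000,000 megabits
seconds = cap_megabits / RATE_MBPS      # time to burn the cap at full speed
hours = seconds / 3600

print(f"Cap exhausted after about {hours:.1f} hours")  # about 5.3 hours
```

Real-world throughput would be somewhat lower than the advertised line rate, which is why the article’s “about 5 hours” requires coordinated, saturating use of the pipe.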

Can you imagine using up your month’s data allowance in less than a week? I know I can with the internet junkies in my house!

A bit more about the service: one must also have Comcast as television and phone provider, because the subscription to the “Triple Play” is what allows the reduced price of $105 per month. Without the phone and television, the service is still available, but that price has not yet been released. A good bet is that it hovers at the $200 mark.

So will many bite? I’m certain that some will, just to be able to show off for their friends. But the caps not being lifted, the phone and television usage adding to the data transfers, and the recent news of cable companies having problems with accurate measurements on metered services all lead to the suspicion that $105 per month will not be the end of it – and who wants that?

Worrying about one’s usage, or fighting about the mis-measurement of that usage, is not something that is Comcastic!

The way these new and exploratory services are priced always comes under the “whatever the traffic will bear” banner, and for the good of all, people should not bear it – opting instead to have Comcast either drop the pricing, or drop the caps. I’m certain either would have customers flocking to the service in droves.


OpenOffice Without Oracle? Is That Like A Fish Without A Bicycle?

When I was a little boy, there was a feminist joke about a woman without a man being like a fish without a bicycle, and the same logic may apply to the OpenOffice group. There may be absolutely no reason for the continuation of the association, and all that Oracle money may not make one iota of difference to the development process.

The whys about that, for those that have not followed the saga of OpenOffice since the sale of Sun to Oracle, may not be readily apparent.

It begins with the disclosure that those within the OpenOffice group were unhappy with the sale of Sun to Oracle before the deal was ever finalized, and that many were secretly wishing that IBM had gotten the nod instead. With the announcement of the acquisition by Oracle, OpenOffice personnel began extricating themselves from the perceived mess to come, and in doing so created The Document Foundation, with its baby, LibreOffice.

There is also the problem of OpenOffice becoming a commercial product, as Oracle went through a period of wanting to monetize absolutely everything, going so far as to ask ridiculous sums of money for Open Document Format plug-ins, which had begun their lives as free downloads. That idea has gone the way of the dodo however, as Oracle has scrapped all the plans for that, including the cloud office portions, which were, for a time, being heavily promoted. Several longstanding URLs dedicated to the promotion of the commercial Oracle Office product, which was to be completely rewritten using proprietary JavaFX, have gone completely 404.

The Register, where this news originally came from, offers no explanation for the outwardly sudden change of direction, and wonders if it was a religious conversion –

Oracle offered no reason for its sudden change on Friday. Oracle may well have had a Saul-like road-to-Damascus conversion to the principles of open source. Sources close to the company have been telling us lately that Oracle has realized it has taken needless lumps for its actions on open source and Java, and is learning how to work with the open source projects it inherited from Sun.

With the long held values, and direction of Oracle, shown clearly by Mr. Ellison, that sort of change is doubtful. It is much more likely that Oracle wishes to cut its losses, seeing that the remaining OpenOffice crew may have been near a mutiny, or perhaps simply realizing that Oracle’s fate was not expressly tied to being the everything-to-everyone corporation. (Microsoft already has those delusions of grandeur.)

While The Document Foundation appears to be doing nicely on its own, the continued work of that body is not absolutely certain, as not all the monetary streams flowing inwards are known. OpenOffice was always dependent on the largesse of Sun, and it is uncertain how long the new, lighter OpenOffice team, with or without a recombination with their mates who formed The Document Foundation, can survive in the world without a sugar daddy.

Will OpenOffice strike out on its own, alone? Are the ties to Oracle simply being loosened, and we are misinterpreting the words of the Oracle spokesperson? Or will OpenOffice and LibreOffice recombine to become a more formidable product, where the adrenalin rush of the recombination pushes them to new heights?


Native HTML5, And Other Microsoft B(asic) S(trategies)

Microsoft can say pretty much whatever they wish when it comes to Windows operating system technologies, because no one else produces a Windows workalike operating system. When it comes to browser technology, however, they should stick to things they can defend completely or explain with little trouble.

When one of the chief purveyors of the current Microsoft browser line spoke of native HTML5 as a Microsoft domain, I’m certain that no one really knew what the hell he was speaking of, but no doubt the fanboys were about ready to wet themselves, as a new Microsoft neologism was about to begin life. Unfortunately, the developers at Mozilla and Opera got wind of this B(asic) S(trategy) and called foul, as it was clear that, all cleverness aside, the speech was full of nothing but the hottest of hot air.

When someone takes it upon himself to coin a word or phrase, it is important that he give a clear definition of what he is communicating at the outset. Dean Hachamovitch of Microsoft did nothing of the sort, preferring to let those in attendance infer what they might from his words. The people from Mozilla and Opera must have gotten the exact transcript, and when no clarity was provided, they decided to give the speech the full recognition it was due, which began with its classification as top-grade fertilizer.

Though Hachamovitch was trying to give the impression that the Windows 7, IE9, HTML5 combination was capable of some sort of synergistic effect, he was far too clumsy in his delivery, and sounded like the current head of Microsoft, another person frequently at a loss for precision of speech, though never at a loss for words. It seems to be the post-Gates Microsoft way.

The Mozilla folks were very acerbic in their statements, which were as free flowing as the original remarks from the minister from the church of Microsoft.

PC World in its coverage gives a few of the witticisms, but perhaps the most on point is the following  –

“I’m pretty sure Firefox 5 has ‘complete native HTML5’ support,” said Asa Dotzler, Mozilla’s director of community development. “We should resolve this as fixed and be sure to let the world know we beat Microsoft to shipping *complete* native HTML5.”

Word from the Opera camp was a bit more sedate, but the remarks must still have taken at least one of its leaders aback, as he is co-author of a book called Introducing HTML5. The author stated that the beauty of the web is that there is no one best platform, or, as the Microsoft orator put it, native choice. So many things work with the web, and each delivers what is needed to those using the specific hardware/software combination.

We might hope that this sound drubbing will get Microsoft to speak more explicitly and exactly, with less of their own flourishes, known to many as FUD. The next flourish from Microsoft will tell the tale, as we will see either more cake with less icing, or a return to the same old phrasing, guaranteed to irritate and aggravate those outside the halls of Redmond.



With USB 3.0, AMD Once Again Proves Its Worth To The Little Guy

USB 3.0 might be the invention of Intel, but try to find native USB 3.0 on any Intel motherboard.

You won’t.

Right now, there are no motherboards which include USB 3.0 ports as part of the core chipset. The very first ones which will include USB 3.0 are coming from rival AMD, and will be found on the chipsets which pair with the combined CPU+GPU processors AMD aptly names Accelerated Processing Units (APUs).

Intel may not be pushing the USB 3.0 specification on its motherboards because of Light Peak, now branded Thunderbolt, but that specification will need some time to develop connecting devices and plant itself in the minds of the user base. USB 3.0, on the other hand, is a step up in speed for a public that knows exactly why USB is a good thing, and how to use it. There is also the number of USB 2.0 and 1.1 devices which will hook up immediately to USB 3.0 ports, whether on a motherboard or connected to a hub – there is no analog in the world of Thunderbolt, where all the connections will be new and different, save for perhaps a few very lonely Apple devices.

With this, the number of devices which support USB 3.0 will increase drastically, and their prices will drop – by small amounts at first – but they will be dropping. That means lots of USB 3.0 cables, external hard drive cases, and other SuperSpeed peripherals.

“AMD Fusion Controller Hubs will provide competitive performance while consuming low power with active USB 3.0 traffic for high definition video and fast connectivity with the latest SuperSpeed USB devices.”

[PC Magazine]

It can be argued that many things don’t need the increased speed, but as with other designs that came before, the availability will bring speed as the device manufacturers see that it is available.

Certainly video will benefit, especially HD video. As for those who wish to debate the USB versus Firewire question, it can be moved to another realm: SuperSpeed USB does not have the intelligence of Firewire 800, but it has the raw speed. In many cases, that is all that is needed, because not many people, or devices, take advantage of the daisy-chaining capabilities of Firewire. Firewire continues to be a professional and prosumer standard, while USB is what Average Joe uses, and it works well for him.
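To put “raw speed” in perspective, a small sketch comparing nominal signaling rates makes the point; the 25 GB file size is an arbitrary Blu-ray-scale example of mine, and real sustained throughput is considerably lower on every bus, with FireWire’s more efficient protocol narrowing the gap in practice:

```python
# Nominal signaling rates in Mb/s; actual sustained throughput is lower.
NOMINAL_MBPS = {
    "USB 2.0 Hi-Speed": 480,
    "FireWire 800": 800,
    "USB 3.0 SuperSpeed": 5000,
}

FILE_GB = 25  # a Blu-ray-sized HD video file, decimal gigabytes

for bus, rate in NOMINAL_MBPS.items():
    minutes = FILE_GB * 1000 * 8 / rate / 60  # transfer time at the raw rate
    print(f"{bus:>18}: {minutes:5.1f} min, best case")
```

Even as a best-case figure, the order-of-magnitude jump from 480 Mb/s to 5 Gb/s is why daisy-chaining intelligence matters less than it once did.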

If this development continues to mirror the older USB standards, it will be mid-2012 before we see many USB 3.0 devices and cables on shelves, but when they do hit, they will hit in a big way, and USB 2.0 stuff will dwindle rather quickly.

I await the purchase of my first Belkin USB 3.0 7-port (or greater) hub, for around $50…


Repeated Mistakes Revisited – Internet Explorer 10 Will Not Install On Vista

If this sounds like something you have heard before, you are right. This time however, Microsoft may just get away with it, as the numbers of people using Windows Vista are much smaller than those running Windows XP who were told that Internet Explorer 9 would not be an upgrade for them.

Microsoft continues to show amazing hubris, as the developers believe that their browser, which arrived last month, performing in the middle of the pack, is something that users of their operating systems cannot do without. That was something never established, and will hardly become the norm in the time of so many other, better choices.

Only the Microsoft zealots and fanboys will find this to be a problem – and they may not, for the ability to run IE10, when it arrives, may be as simple as doctoring a few lines of code in the installer. It was the same with running the then recently purchased Windows Defender on Windows 2000, before Microsoft even bothered to remove the last traces of the marks of Giant Software: Microsoft gave notice that Defender would not run, but with the changes to the installer, it ran with no further attention.

After all, there have been no major changes to the underpinnings of the system from Vista to Windows 7; the things on the surface look like the work of an adequate mechanic who wishes to turn his late-model Chevrolet equivalent into a Cadillac Escalade. The looks are there, but a closer inspection reveals origins different from the exterior clues.

There may be no move in that direction, however, as there are so many other choices that the extra effort involved to obtain the use of a browser, yet unproven and without pedigree, may not be undertaken.

Only when Microsoft brings in a browser that debuts at the top of the charts will there be a wish for the masses to use it. When it is fast, easy to use, and brings something to the table that no other browser does, it will become necessary. Until that time it will take more than a blue lower case “e” to get people excited.

Keeping those happy with Windows Vista from using the new browser will only anger many, for they will see that Microsoft wishes to push upgrades on those not wanting them. More scrutiny will then come to the fact that Microsoft has yet to get the last niggling bugs out of anything it produces, and that the repair for long standing bugs is always presented as a paid upgrade.

As someone stated on another site, it looks as though IE9 is the new IE6. Microsoft has done well by keeping versions of its main cash cow, Office, compatible with as wide a range of systems as possible; it might follow that it would do the same with anything else that had become a major effort.

Clarity in thinking and continuing consistency should be the hallmarks of any large corporation, but Mr. Ballmer has not yet learned that lesson.



Land Of The Lost – The 23” Tablet Computer

Sometimes the ability to be amazed is the thing you miss most when you have worked with things for any length of time. The stream of things which come from the (fertile?) minds of the developers of computer hardware was an area where I thought I could no longer be surprised.

I was totally wrong.

This morning it was announced that Lenovo has decided that its lineup will not be complete without a 23” tablet offering.

Though there may be those who will do more than think twice about this upcoming product, I wonder where the cries of foul will come from first – those who do not want to lug the unwieldy device around, or those who complain that the device is too easily damaged, as it will no doubt be bent to the point of breakage by many early adopters.

Actually, instead of a tablet profile at 23”, why not a 23” notebook? A laptop could feature a couple of things different from those currently available, and would probably seem absolutely atavistic to the many who must have the very smallest, lightest thing available.

I’d like to see a full keyboard, with no chiclet keys, but instead full-travel keys which are protected by an outer ridge that moves away from the lower half of the laptop when opened.

An optical drive may be considered passé by some, but those feeling that way are hardly working from usability studies, as there is lots of life left in optical drives, and many reasons why they will be around for a long time. Including one which is a bit more durable, and includes LightScribe, should be a design point.

With the larger size available for the lower half of the clamshell, a much more feature-replete touchpad could be included, allowing multi-touch gestures without the added expense, and questionable worth, of an actual touch-enabled screen. That would make usability soar, while sparing those who currently feel the need for an adjunct mouse from carrying that extra piece of equipment. A better touchpad, with multiple buttons of its own and multi-finger gesturing allowing for precise control, would obviate any need for the multi-button mouse. (Anyone who has used the Adesso 4-button trackpads knows how handy those buttons are, and how easily a person becomes accustomed to their use.)

With the real estate allowed by the larger form, a combination of increased CPU efficiency and increased space within the confines of the laptop outline could make for a much larger battery compartment, allowing usage away from the power mains for much longer than is currently the norm.

With the increased screen size allowing for 1920×1080 resolution, the vastly improved keyboard and touchpad, the increased battery life, and better build quality (using a monocoque design), the larger notebook might actually have a chance at making the masses give up on the old desktop machine, which so many find difficult to integrate into their home decor.


Another One Bites The Dust

After a long run, which some will be quick to say was too long, the first internet browser branded as social will see an end to support on the 26th of April.

Flock, once heralded as a game changer, has changed no games, and has lost much of its initial thrust, as the many add-ons available for almost any other browser now provide all the same benefits, along with many others that increase those browsers’ usefulness in other ways.

The original Flock was built on Mozilla code, and was essentially Firefox with a few additions allowing the proclamation of the birth of social browsing. That was just six years ago yesterday. A long time in internet terms, but a very short life for anything else.

Many will blame the demise of the browser on the switch to the Chromium codebase last summer, and while that may have stalled any improvements in the social aspects of the browser, it certainly brought worthwhile improvements in all other areas, so overall, most would consider it a wash.

The actual change to Chromium code likely did nothing but help the browser last a bit longer, as the ease of coding extensions, and the capacious extension vault, no doubt made many users very happy, since they were able to add on without the extreme slowing that is the signature of the Firefox codebase.

The development of Rockmelt, by Marc Andreessen, may also have pushed the Flock product to the side, as the higher profile of any Andreessen project would lead to lots of viewings of the Rockmelt initial offering, and subsequent updates.

In the end, we may never really know why Flock, as a continuing project, will cease to exist on the 26th of this month. The page where one would go to download the product does not even offer that choice any longer, instead telling visitors to choose either the newest Firefox or Chrome browser, since those will continue to be supported.

While the concern for the user’s online safety is touching, it is very odd that the browser can no longer be easily downloaded, and it makes this writer wonder how many unpatched problems might have just been discovered, with no manpower left to adapt the code, or to bring it up to snuff for the short term.

Unless I have gotten my timelines wrong, I believe this makes Flock the shortest-lived browser of note, which may be something that, in and of itself, will assure its mark in the internet history accounts.

The FAQ gives some interesting answers to questions many will no doubt ask.



Which End Of The Spectrum Is More Exciting?

These days, it is an amazing thing to look around and see that massive work is being done at both ends of the PC spectrum – the top end, where Intel dominates with its i7 processors, and the extreme bottom end, with its newest Atom platform, code-named Oak Trail.

Intel is determined to own both ends of the market, and the changes at the low end may be more important than those at the top, for if Intel can push enough performance from the Oak Trail platform, while being parsimonious with the energy expenditures, it is just minutely possible that the company can keep the x86 platform in the spotlight, and keep the efforts of the ARM producers in the background, wondering why their plan didn’t work.

Oak Trail is an amazing leap from the original Atom offerings. Built around the Atom Z670 processor and SM35 Express Chipset, the newest platform consumes less power than its predecessor, can decode 1080p video, supports Flash, and can be paired with multiple software platforms, including Windows, Android, and MeeGo. The Z670 is clocked at 1.5 GHz and features GMA 600 integrated graphics, a built-in memory controller, and 512 kB of L2 cache.

Of course, for the extreme low-power notebook market, there is the Cedar Trail platform, which has all that Oak Trail includes, but adds better support for Blu-ray, including all the additions of the 2.0 specification, plus yet-to-be-fully-fleshed-out offerings like Intel Wireless Display, Intel Wireless Music, PCSynch, and FastBoot.

All of these things will make high-quality, small form factor PCs a reality for many, with the ability to carry all one needs easily, and power it for hours without worrying where the next AC outlet is located. If this sounds like a challenge to the iPad 2, you’re right. The iPad 2 may still win out on overall frugality with power, but what can be done on the Intel products will be much more to the liking of anyone who currently works with a standard PC, requiring much less adaptation, and only slightly more power.

There has yet to be any answer from AMD in this market space, as nothing the company has announced, or that has been leaked, operates at such a low power level. With the market beginning to move in that direction, including all the Android devices, as well as the iPad, which Apple hopes to make ubiquitous, it is only reasonable to think that AMD will bring something to this segment, yet nothing announced approaches the power envelope of these newest Intel products.

While many may still have their gaze fixed on the i7 and the support hardware, which breathes fire and chews up corporate spreadsheet calculations like midnight snacks, the lower power segment is going to be much more important to most, just as the frugal and fun Mini Cooper is a more attainable goal than any of the creations of Enzo Ferrari are to the driving crowd.


Internet Explorer 9 Rises To 4% Usage On Windows 7, Should We Be Impressed?

The idea that Microsoft will use any opportunity to crow about something proves that the company has lost much of its focus. Back in the old days, when Mr. Gates was at the helm, 4% of anything was nothing anyone would get excited about; work would continue until significant gains were made in a field where they would matter.

Just because there is a new version of Internet Explorer does not mean it is something worthwhile – it is not. The browser exploded onto the scene with a thud, never occupying the top spot by anyone’s estimation. Being solidly mediocre is hardly an achievement.

But the company must crow about something, because the fact is, the name Microsoft must be on the lips of many, or things are not right in the Microsoft world. Because the Xbox is doing nothing out of the ordinary right now, and the Kinect wave is past its crest, it is necessary to speak of things like Windows 7 passing Windows XP in usage share – but only in the United States, and only using a less than scientifically reliable method of determination. (Statcounter’s methods have been questioned in the past over many other measurements…)

So, as I had said, Internet Explorer 9 would be downloaded by many, only to be used as a method of last resort once in place on the hard drives of Windows Vista and Windows 7 users. Statcounter’s claims put Windows 7 at 31.x% usage and Vista at 19.x%, so a measly 4% usage is poor indeed: if roughly 50% of the personal computers that don’t bear a fruit symbol are able to use IE9, and only 4% are actually making use of the browser, it does not speak well for its performance or its ability to engender an audience.
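The arithmetic behind that claim can be sketched in a few lines; the decimals here are rounded assumptions standing in for Statcounter’s “.x” figures:

```python
# Back-of-the-envelope adoption math; the decimals are assumptions
# standing in for the ".x" figures quoted above.
win7_share = 31.0    # % of all PCs running Windows 7
vista_share = 19.0   # % of all PCs running Windows Vista
ie9_share = 3.6      # % of all PCs actually browsing with IE9

capable = win7_share + vista_share      # ~50% of PCs can run IE9 at all
adoption = ie9_share / capable * 100    # IE9's share among capable machines

print(round(adoption, 1))  # roughly seven percent of eligible machines
```

Even granting every rounding in Microsoft’s favor, fewer than one in twelve machines that could run IE9 actually does, which is the point being made.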

Actually, it is only 3.6% adoption, if you look closely, and that is even worse. But in usual Microsoft fashion, the clues have already been dropped concerning the upcoming Internet Explorer 10. The only good thing about that is that sites will definitely need proper testing and identification for two-digit browser version numbers. The Opera and Chrome browsers already identify themselves properly by default.

It’s the Microsoft way. When something is mediocre and less than popular, the FUD gets spread, and the rumors begin, and the anticipation grows. All the way until the letdown of the actual release.


Opera 11.10 Nears Final Release As RCs Come At A Furious Pace

The bugs are dying at an incredible pace in Norway today, as the Opera Desktop Team has put forth another release candidate, these last few coming with point numbers! The latest, and the second for this very long spring day, is RC 4.1, which brings lots of presentation bugs to an end.

Pages which were presented much differently in Iron, or in Internet Explorer 8, are now mostly the same, with only minor variations. As to which is truly correct, we’ll have to wait for the Acid testing. One of the things I have noticed with Opera 11, since version 10, is that some pages render incorrectly, as if Opera were a mobile browser – mycokerewards is one of them. I suspect this is because the site does not correctly interpret two-digit version numbers, and we’ll probably not see much change on this until we get Internet Explorer 10, which will force some of the lazy sites into some sort of compliance.

I’ve always maintained that if all sites were coded correctly, the bad actors, like Internet Exploder 6, would disappear at near light speed.
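As a hypothetical sketch of the kind of sloppy version sniffing at fault (the user-agent string below is simplified; real Opera UA strings are more convoluted), a pattern that captures only one digit reads “Opera/11” as version 1, and a site gating on “version 9 or better” then serves a fallback or mobile page:

```python
import re

def sniffed_major_naive(ua):
    # The careless pattern: captures a single digit, so "11" reads as "1".
    m = re.search(r"Opera[/ ](\d)", ua)
    return int(m.group(1)) if m else None

def sniffed_major_fixed(ua):
    # The tolerant pattern: consumes every digit of the major version.
    m = re.search(r"Opera[/ ](\d+)", ua)
    return int(m.group(1)) if m else None

ua = "Opera/11.10 (Windows NT 6.1; U; en)"  # simplified, hypothetical UA

print(sniffed_major_naive(ua), sniffed_major_fixed(ua))  # 1 11
```

The one-character difference in the pattern is the whole bug, which is why two-digit versions will keep tripping sites until they test for them.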


DSK-333655 Opera crashes when you cancel a Unite download

DSK-333661 Missing close button in notification popup

DSK-333313 Second column not created when tabs are set to wrap and tab bar is to right/left

DSK-333669 CSS parser crash

DSK-333662 [Mac] Fix for missing extensions for certain file types

CORE-37911 [Mac] Hidden plugins on Mac don’t initialize

Above are the latest noted changes, but as the site declares, there are many other little things fixed which were niggling problems for so long. The Speed Dial page is usable now, though a bit of futzing with the parameters is necessary – the way to get non-standard layouts is different than before, as manually editing the ini file is no longer needed, and no longer has any effect.

There is more speed in page rendering than with 11.01, and the general direction of the efforts shows that the guys on the development team are still pushing for the very best browser, as judged by those not otherwise predisposed to long-held prejudices.

The download is worthwhile, and very stable. It may only be a day or so until the final product is released, but until then, this revision, 2092 [Windows], is a big step up from 11.01 Stable, and worth the time to remove 11.01 and install 11.10 build 2092. The long-standing practice of installing over older versions is no longer recommended by the Team, as some of the ini files and other areas where vital information is stored have changed, making overwrite installs do strange things.



Canonical Terminates Free CD Disbursement – Will It Affect Ubuntu’s Usage Numbers?

The free ride for many – getting a shiny new CD with each twice-yearly Ubuntu release for installation on their machines – is now gone, as Canonical has announced that the ShipIt program is officially over.

The program was a great idea in 2005, when broadband penetration was not significant and bulk mail rates were more appealing, but now, with everyone’s costs rising, Canonical must put the money where it can be best used, and distribution of CDs is not considered to be it.

Today, it is easy to obtain a copy by download, if not at your own location, then at a public library or college, where broadband capabilities make even the stingiest of allotted times usable for obtaining the 700 MB image. Where no optical drive is available to burn the CD, it is easy enough to connect a flash drive to an open USB port and make the copy at home. It is also possible to leave the download on the flash drive and use it as the point of entry for many newer machines. If your machine does not allow booting from a USB-attached peripheral, there are still other methods available. Remember, necessity is the mother of invention, and we can be very inventive when needs be.
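Since a 700 MB image pulled down on borrowed library time can arrive damaged, it is worth checking it against the checksum published on the release page before burning or copying. A minimal sketch follows; the file name and digest in the usage comment are placeholders, and the hash algorithm should match whatever Canonical publishes for the release:

```python
import hashlib

def verify_image(path, expected_digest, algo="sha256", chunk_size=1 << 20):
    """Hash the downloaded ISO in 1 MB chunks (a 700 MB file won't fit
    comfortably in RAM) and compare with the published checksum."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest() == expected_digest.strip().lower()

# Hypothetical usage; substitute the real file and the published digest:
# verify_image("ubuntu-11.04-desktop-i386.iso", "…")
```

A mismatch means a re-download, which is far cheaper than discovering a corrupt disc halfway through an install.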

The newest version of Ubuntu, 11.04, will not be obtainable by the ShipIt method, but though Canonical is not doing the distribution directly, various local Ubuntu communities will still be shipping discs from their various locations. In the United States alone, there are LoCos (their moniker, not mine) in Arizona, California, Colorado, Illinois, Washington, D.C., Florida, Georgia, Indiana, Kentucky, Maryland, Massachusetts, Michigan, New Jersey, New Mexico, New York, North Carolina, Ohio, Pennsylvania, and Utah. The list is located here, and tells who needs to be contacted to get your very own disc – what you can no longer do is obtain a slew of them for your own distribution.

Those truly interested in furthering the efforts of open source (and who isn’t?) should also be reminded that one can still obtain those discs from the Canonical store, at a very small price which covers media costs as well as a small amount to continue development efforts. Mark Shuttleworth is no Rupert Murdoch!

So though this announcement precludes some from obtaining many discs, it is still easy to obtain each release from a local unit of the Ubuntu family. It is not known whether the discs will include the cardboard sleeve with graphics that the discs from the Ubuntu mothership did, but the content is there, nonetheless.

While the one large program ends, Canonical is speaking of a new one beginning, with the possibility of a testing environment in the cloud coming this year.

Though it will no doubt raise the ire of those Ubuntu fans reading this, I might suggest that a more complete (and up-to-date) set of man pages be part of the newest distributions, as well as much better interactive help in the OS itself. To my way of thinking, putting help for problems even closer than a possibly disconnected internet would do the most to help adoption rise.