Comparing different technologies

OK, let me make one thing clear, and most of you know this already: I’m not a fan of FCoE, and I’ve written numerous articles on the topic. I work for a vendor, focusing on storage connectivity in an engineering support role, which basically means I need to know this stuff. When I see people struggling with storage from a connectivity perspective, and I get to the point where I either meet them in person or have them on the phone, I’m pretty often absolutely flabbergasted by the lack of fairly basic Fibre-Channel knowledge. The basics of checking port and link errors, the generic concept of flow control, and operational management are very often at a pretty low level. Don’t get me wrong, I’m not accusing anyone of being stupid, but it turns out that education and self-study w.r.t. storage are not seen as critical items on the day-to-day IT department shopping list; the focus leans more towards networking, operating systems and applications.
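Take that flow-control concept as an example. Fibre-Channel flow control is credit-based: a port may only transmit a frame while it holds a buffer-to-buffer credit, and the receiver returns a credit (via an R_RDY primitive) each time it frees a buffer. A minimal, illustrative sketch of that idea (this is not vendor code; the class name and credit values are made up for illustration):

```python
# Illustrative sketch of Fibre-Channel buffer-to-buffer (BB) credit flow
# control. A sender may only transmit while it holds credits; each R_RDY
# returned by the receiver restores one credit. Names and values are
# invented for the example.

class BBCreditPort:
    def __init__(self, bb_credit):
        self.credits = bb_credit   # credits granted at fabric/port login
        self.sent = 0

    def can_send(self):
        return self.credits > 0

    def send_frame(self):
        if not self.can_send():
            return False           # zero credits: transmission must pause
        self.credits -= 1
        self.sent += 1
        return True

    def receive_r_rdy(self):
        self.credits += 1          # receiver freed a buffer


port = BBCreditPort(bb_credit=2)
print(port.send_frame())  # True  (credits 2 -> 1)
print(port.send_frame())  # True  (credits 1 -> 0)
print(port.send_frame())  # False (no credits left: link stalls)
port.receive_r_rdy()      # receiver returns a credit
print(port.send_frame())  # True
```

The same mechanism is why a slow-draining device can back up an entire fabric: if the R_RDYs stop coming, the sender simply stops, and the congestion ripples upstream.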

If, however, your sole job is storage, and you are tasked to compare two different technologies that touch the same subject, and you manage to screw it up the way the Evaluator Group did last week, I seriously doubt the credibility of such an organization.

The issue is that Brocade commissioned a research project from the Evaluator Group to determine the pros and cons of FC versus FCoE. Obviously, if your income mostly depends on FC, you’ll make pretty sure the outcome favours FC in more than one area. This kind of “research” has been done for many years, and it has always been up to the reader to try and bake a cake out of these reports, which by definition are biased anyway. The problem with this lab test is that the test engineers of the EG seem to have had absolutely no idea what they were testing, in what form or shape. I’ll spare you the technical details; these have been pretty well discussed in the posts by Tony Bourke of DataCentre Overlords, J Metz and Dave Alexander, to read at your leisure. Even former Evaluator Group analyst and storage author Greg Schulz could not find a conclusive outcome.

What I would like to know is how two highly regarded technology companies with a pretty good track record (each in their field) were able to produce a report that enabled the entire storage- and unified-computing-savvy blogosphere to burn it down to its socks, without either of the two companies having the guts to admit their fault or to provide a response that proves these bloggers wrong. (There has been a little addendum to the report in which the allegations are addressed, but to me it reads more as an apology that the test engineers had no clue what they were testing.)

The remark that strikes me is:

2)    This test does not seem to be the best way to compare only an FCoE connection to a Fibre Channel connection.

The goal of the test was to compare two typical configurations.  Customers are deploying high speed solid-state systems in virtual environments.  Currently, a majority of our enterprise customers prefer to use a Fibre Channel SAN for their high performance applications.  When comparing alternatives, customers often consider a blade server like the HP BladeSystem, with Fibre Channel interfaces.  Alternatively, they may also consider Cisco UCS blade systems with FCoE interfaces. In both cases, it is common to connect the blade servers to a SAN (in this case a Fibre Channel switch), as opposed to directly connecting to FCoE storage.

The issue I have with the above is that you admit you tested two totally different systems, and yet the conclusion is that FC is superior to FCoE. Very strange.

Another one is:

5)    Evaluator Group seems to penalize the UCS for being more difficult to setup, but maybe they just understand the HP system better.

Our objective was to report on our experience configuring both environments.  While Evaluator Group has many years of experience setting up enterprise servers; we did not have prior specific experience setting up either a HP c7000 BladeServer, or a Cisco UCS.  As explained, all testing and setup was performed within our lab environment, without direct assistance from any of the vendors.  When issues arose, we utilized typical support mechanisms, calling into the toll-free support lines of the vendor, without asking for special assistance.   Our opinion is based on the difficulty and amount of time needed to configure each system. Our experience was that it took us longer to configure the Cisco UCS than the HP system, and we reported that observation.


To me this comes down to having (or obtaining) knowledge of the platform you are using or testing. You cannot expect a system to dynamically adapt to the purpose you want it for. It’s like comparing a Ferrari to a Formula 1 car: both are incredibly fast, but they have distinct differences in driving characteristics. By setting up either car incorrectly you could even have a Suzuki Liana go faster. (Yeah yeah, Google it…)

Since neither the report nor the response to the comments impresses me from a technical perspective, it strikes me that the resulting document seems to have bypassed any form of technical scrutiny. Normally, when you issue such a report, you throw the result in front of your internal gurus to make sure it sticks and gets validated with a thumbs-up. So how is it possible that such a report was published?

I think I know the answer, and it relates to marketing.

Basically, in a technology-driven world, marketing plays a huge role for every company. Brand recognition, go-to-market models and timelines very often depend on how quickly the marketing departments are able to shout out to the world about new, but also existing, products and services. Marketing departments are basically divided into four segments:

  1. Corporate marketing
  2. Technical marketing
  3. Field marketing
  4. Competitive marketing

Corporate marketing is mostly tasked with defining brand strategies and go-to-market models, and with providing guidelines for the other marketing departments. Corporate presence on the web and in global media also often falls under their responsibility.

Technical marketing is responsible for creating the data sheets and all sorts of technical content for customers, prospects and the (pre-)sales organizations. The tech-marketing people are often drafted from pre-sales organizations and are used for a multitude of evangelizing purposes as well.

Field marketing is primarily tasked with local or regional sales support. Activities like demand generation, local and regional surveys, and organizing events all belong here.

Then there is competitive marketing. These people are tasked with keeping a close eye on what competitors do and aligning responses to those technologies and services based upon the company’s own portfolio. They also provide the sales force with information regarding FUD (Fear, Uncertainty & Doubt) spread by competitors, in order to counter arguments that customers and/or prospects might have seen or heard.

It is my suspicion that the Evaluator Group’s test was commissioned by the last of these. I can’t comment on the technical abilities of this department at Brocade, but it seems some steps were missed here. If people from either technical marketing or engineering had been involved, I don’t think the report would have gotten a thumbs-up, since too many loose ends were hanging off of it. And it was exactly these loose ends that were shot at by all the people I mentioned before.

I hope both Brocade and the Evaluator Group have learned from this experience and will do their homework better next time. There have been multiple examples where such publications have massively backfired on companies.

As a disclosure:

I work for a Brocade OEM in a support engineering role and have no commercial interest in either Brocade or the Evaluator Group. I have huge respect for the Brocade engineering organization; I have been using their products since day one (yes, I was one of the first to deploy a Brocade 1600 1G switch back in 1998) and I still do. I just think it is unfair to toggle the levers so that one technology shows favourably over another. For me there are other factors why FC is preferable to FCoE. My arguments against FCoE are well known and have never been refuted by even the biggest proponents of FCoE. As long as FCoE stays confined within a UCS system I’m 100% happy with that, but do NOT propagate it to the outside world.

Regards,

Erwin

About Erwin van Londen

Master Technical Analyst at Hitachi Data Systems
Storage Networking
