Bare Metal Blog: Testing for Numbers or Performance?

What you test can say a lot about you

Along the lines of the first blog in the testing portion of the Bare Metal Blog series, I’d like to talk a bit more about how the testing environment, the device configuration, and the payloads translate into test results.

One of the problems most advanced mass education systems run into is the question of standardized testing. While it is true that you cannot fix what you have not determined is broken, like most things involving people, testing students for specific areas of knowledge does kind of guarantee that those doing the teaching will err on the side of preparing students to take the test rather than to succeed in life. The mere fact that there IS a test changes what is taught. It is of course possible to make this into a massively positive proposition by targeting the standardized tests at the most important things students need to learn, but for our discussion purposes, the result is the same – the students will be taught whatever is on that test first, and all else secondarily.

This is far too often true of vendor product testing also. The mere fact that the equipment will be tested, combined with the highly competitive nature of most high-tech markets, leans things toward tweaking the device (or the test) to maximize test performance, regardless of what real-world performance will be.

The current most flagrant problem with testing is a variant on an old theme. Way back when testing the throughput of network switches made sense, there was a lot of "packets per second" testing with no payload. That tests the ability of the switch to send packets to the right place, but does not at all test the device in a manner consistent with the real-world usage of switches. Today we have a whole slew of similar tests for ADCs. The purpose of an ADC is to load balance, optimize, and if needed secure the passage of packets. Primarily this is for application traffic, because they're Application Delivery Controllers. Yet application traffic is layer seven traffic, which means you need to do some layer seven decision-making if the device is to be tested as it will run in the real world. If the packet is a layer seven packet, but layer four switching is all that is performed on it, the test is completely useless for determining the actual capabilities of the device. And yet there is a lot of that type of testing going on out there right now.

It's time – way past time – to drive testing into the real world for ADCs. Layer seven decision-making is much more complex and requires a deep look at the packets in question, meaning that the results will not be nearly as pretty as simple layer four switching numbers. And while you cannot do a direct comparison of all of the optional features of two different ADCs – the range of optional functionality is simply too broad once a solid ADC platform is deployed – you can test the basic capabilities and responsiveness of the core products.
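To make the distinction concrete, here is a minimal sketch in Python (the pool names and routing rules are hypothetical, purely for illustration): a layer four decision can be made from the TCP header alone, while a layer seven decision forces the device to buffer and parse the application payload before it can pick a backend.

```python
# Minimal sketch of the two decision modes. Pool names and rules are
# hypothetical; a real ADC applies far richer policy than this.

def l4_pick_backend(dst_port: int) -> str:
    # Layer four: route on the TCP destination port alone.
    # The payload is never inspected.
    return "web_pool" if dst_port == 80 else "default_pool"

def l7_pick_backend(raw_request: bytes) -> str:
    # Layer seven: buffer and parse the HTTP request before deciding.
    head = raw_request.split(b"\r\n\r\n", 1)[0].decode("latin-1")
    request_line, *header_lines = head.split("\r\n")
    _method, path, _version = request_line.split(" ", 2)
    headers = dict(
        line.split(": ", 1) for line in header_lines if ": " in line
    )
    # The kind of rules an ADC might apply: URI- and Host-based selection.
    if path.startswith("/api/"):
        return "api_pool"
    if headers.get("Host", "").startswith("static."):
        return "static_pool"
    return "web_pool"

request = b"GET /api/users HTTP/1.1\r\nHost: www.example.com\r\n\r\n"
print(l4_pick_backend(80))       # web_pool -- never looked at the payload
print(l7_pick_backend(request))  # api_pool -- had to parse the request
```

Benchmark only the first code path and you have measured layer four switching, no matter how the device is marketed.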

And that is what we, as an industry, must begin to insist on. I use one single oddity in ADC testing here, but every branch of high-tech testing I've been involved in over the years – security, network gear, storage, application – has similar "this is not good enough" testing that we need to demand be dropped in favor of solid testing that reflects a real-world device. Not your real-world device, unless you are running the test lab, but a real-world device that is seeing – and more importantly acting upon – data that the device will encounter in an actual network, doing the job it was designed for.

As I mentioned in the last testing installment, you can make an ADC look astounding if your tests don't actually force it to do anything. For our public testing, we have standards, and we offer up our configuration and testing goals on DevCentral. Whether you use them to validate the test results F5 uses or to set up the tests in your own environment, publicly talking about how testing is performed is a big deal. Ask your vendor for the configuration files and testing plan when numbers are tossed at you, and make certain you know what they're testing when they try to impress you with over-the-top performance numbers. In my career, I have seen cases where "double the performance of our nearest competitor" was used publicly and was as close to an outright lie as possible, since the test and configuration were different between the two products the test claimed to compare.
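In that spirit, here is a rough sketch (Python standard library only; the device hostname and request mix are hypothetical, and this is an illustration of the idea, not F5's actual methodology) of what "forcing the device to do something" looks like from the test client's side: every request varies the path and Host header, so a device claiming layer seven throughput has to make a fresh routing decision on each one rather than replaying a single cached answer.

```python
# Sketch of a layer seven test driver, not a full benchmark harness.
# The point is the varied request mix: each request must be parsed and
# routed individually by the device under test.
import itertools
import time
import urllib.request

ADC = "http://adc-under-test.example.com"  # hypothetical device under test
PATHS = ["/api/users", "/static/logo.png", "/index.html"]
HOSTS = ["www.example.com", "static.example.com"]

latencies = []
request_mix = itertools.cycle(itertools.product(PATHS, HOSTS))
for path, host in itertools.islice(request_mix, 100):
    req = urllib.request.Request(ADC + path, headers={"Host": host})
    start = time.perf_counter()
    with urllib.request.urlopen(req, timeout=5) as resp:
        resp.read()  # pull the full payload through the device
    latencies.append(time.perf_counter() - start)

latencies.sort()
print(f"median: {latencies[len(latencies) // 2] * 1000:.1f} ms")
print(f"p95:    {latencies[int(len(latencies) * 0.95)] * 1000:.1f} ms")
```

Publish the request mix and the device configuration along with the numbers, and anyone can re-run the test – which is the whole point of putting configurations and testing goals out in the open.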

When you buy any form of datacenter equipment, you're going to be stuck with it for a good long while. Make certain you know how the testing that informs your decision was performed, no matter who did the testing. Independent third-party testing sometimes isn't so independent, and knowing that can make you more cautious before saddling your company with gear you'll have to live with.


More Stories By Don MacVittie

Don MacVittie is founder of Ingrained Technology, a technical advocacy and software development consultancy. He has experience in application development, architecture, infrastructure, technical writing, DevOps, and IT management. MacVittie holds a B.S. in Computer Science from Northern Michigan University, and an M.S. in Computer Science from Nova Southeastern University.
