How to Ease API Testing Constraints | @DevOpsSummit [#API #DevOps]

The top API testing issues that organizations encounter and how automation and a DevOps team approach can address them

Ensuring API integrity is difficult in today's complex cloud, on-premises, and hybrid application environments. In this interview with TechTarget, Parasoft solution architect manager Spencer Debrosse shares his experience with the top API testing issues that organizations encounter and how automation and a DevOps team approach can address them.

The following is an excerpt from that interview...

What makes testing APIs challenging?
When you're building an application, you're not just using your own APIs or your own internal applications. Instead, you have to rely on a wide variety of endpoints, APIs, and databases. We see lots of industry-specific, third-party API integration. For example, Sabre is common in the hospitality and airline industries, and credit card and address verification APIs are common in retail.

If I integrate with Facebook or with other applications, how can I tell whether those APIs are in the state I need them to be in, available on my release schedule, and functioning the way I need?

That's really why availability is a constant problem: there are all these moving pieces. Developers, as well as testers and QA architects, need to get all those pieces in sync to optimize their release schedule.
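One lightweight way to keep an eye on those moving pieces is an automated availability and contract smoke check against each external dependency. Below is a minimal sketch using Java's built-in HttpClient; the partner URL and the "status" field it checks for are illustrative assumptions, not details from the interview.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;

/** Minimal availability and contract smoke check for a third-party API. */
public class ApiSmokeCheck {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newBuilder()
                .connectTimeout(Duration.ofSeconds(5))
                .build();

        // Hypothetical partner endpoint -- replace with the API your release depends on.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://partner.example.com/v1/status"))
                .timeout(Duration.ofSeconds(10))
                .GET()
                .build();

        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());

        // Availability: did the dependency answer at all, and with a success code?
        if (response.statusCode() != 200) {
            throw new AssertionError("Dependency unavailable: HTTP " + response.statusCode());
        }
        // Basic contract check: the field our integration relies on is still present.
        if (!response.body().contains("\"status\"")) {
            throw new AssertionError("Response no longer contains the expected 'status' field");
        }
        System.out.println("Dependency is reachable and the expected contract field is present.");
    }
}

A check like this can run on a schedule or before each test cycle, so the team learns that a third-party API has drifted or gone down before it blocks a release.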

How does a business's organizational structure hamper API testing?
Access to internal resources can be a challenge. Frequently, all of these resources are controlled and managed by different groups. If I'm a developer building an application, I work in an environment that depends on lots of other groups; it's not just my own development effort. These internal resources may be unreliable, or I may have little control over them. Many financial organizations face internal testing bottlenecks associated with mainframe access, for example.

In another example, as a developer, I may rely on a database maintained by a DBA on a separate team. Or I could rely on an API maintained by a different group of developers (and these developers may or may not be part of my organization). This disconnect between who needs an API for testing and who controls the API means that my test environment will commonly be a bottleneck in my development or QA process.
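When the group that owns a dependency can't provide reliable test access on your schedule, a common workaround is to stand up a lightweight stub (a simple form of service virtualization) that mimics just the behavior your tests need. Here is a minimal sketch using the HTTP server bundled with the JDK; the account endpoint, port, and response shape are hypothetical placeholders.

import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

/**
 * Minimal stand-in for an API owned by another team, so tests can run
 * even when the real endpoint is unavailable or access-restricted.
 */
public class StubAccountApi {
    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8089), 0);

        // Hypothetical endpoint shape -- mirror whatever contract your code expects.
        server.createContext("/accounts/42", exchange -> {
            byte[] body = "{\"id\":42,\"status\":\"ACTIVE\"}".getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().add("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });

        server.start();
        System.out.println("Stub account API listening on http://localhost:8089/accounts/42");
    }
}

Pointing the application under test at the stub's URL takes the other team's environment off the critical path, while the real endpoint is still exercised later in integration stages.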

Is testing APIs more difficult from the availability standpoint than testing software, or are there no differences?

The increased focus on mobile development and interconnectivity of applications means that testing in just about any application development project will rely heavily on API integration. So, more API testing is being done than in the past, and that adds another layer of work for quality assurance teams. Otherwise, there is little difference in resource availability problems for API and application testers in modern development.

How are development organizations addressing this API test problem?

From a process standpoint, they're using DevOps to provide more collaboration and fewer constraints for API testers. DevOps, in particular, facilitates "shifting left", where testing is done earlier...
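In practice, shifting left often means the same API checks run on every commit, first against stubbed or virtualized dependencies and later against real environments. Below is a minimal JUnit 5 sketch along those lines, reusing the hypothetical stub above; the base URL, system property name, and expected fields are assumptions for illustration.

import org.junit.jupiter.api.Test;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertTrue;

/**
 * Shift-left style API check: runs on every commit against the stubbed
 * dependency, long before a fully integrated environment is available.
 */
class AccountApiShiftLeftTest {

    // Points at the local stub by default; later pipeline stages can aim the
    // same test at a real staging endpoint via this system property.
    private static final String BASE_URL =
            System.getProperty("account.api.url", "http://localhost:8089");

    @Test
    void accountEndpointReturnsActiveStatus() throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(BASE_URL + "/accounts/42"))
                .GET()
                .build();

        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());

        assertEquals(200, response.statusCode(), "dependency should be reachable");
        assertTrue(response.body().contains("\"status\":\"ACTIVE\""),
                "contract field the application relies on should be present");
    }
}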

***

The complete article continues to discuss:

  • What test-driven development means from a business perspective
  • What technologies are needed to reduce pressure on API testers
  • Why and how to perform a large number of validations very quickly at an early stage of the SDLC
  • Tips for capturing the data needed to make intelligent business decisions.

You can read the complete Ways to Ease API Testing Constraints article here (no registration required).

API Testing Resource Center
To access a host of API testing resources that can help you better understand and apply API testing best practices, see Parasoft's API Testing Resource Center.

About Parasoft API Testing
Parasoft's API Testing solution is widely recognized as the leading enterprise-grade solution for API testing and API integrity. Thoroughly test composite applications with robust support for REST and web services, plus an industry-leading 120+ protocols/message types.

Why choose Parasoft for API Testing?

  • Industry leader since 2002
  • Recognized for ease of use and an intuitive interface
  • Advanced intelligent automated test generation
  • Extensive protocol and technology support
  • End-to-end testing across multiple endpoints (services, ESBs, databases, mainframes, web UI, ERPs...)
  • Designed to support continuous testing

More Stories By Cynthia Dunlop

Cynthia Dunlop, Lead Content Strategist/Writer at Tricentis, writes about software testing and the SDLC, specializing in continuous testing, functional/API testing, DevOps, Agile, and service virtualization. She has written articles for publications including SD Times, Stickyminds, InfoQ, ComputerWorld, IEEE Computer, and Dr. Dobb's Journal. She has also co-authored and ghostwritten several books on software development and testing for Wiley and Wiley-IEEE Press. Dunlop holds a BA from UCLA and an MA from Washington State University.
