BDD Cross-Platform Applications, the Technical Setup

We started our journey into Behaviour Driven Development and automated functional testing nearly three years ago. And it’s definitely been a very interesting journey.

Despite its potentially intimidating name, BDD is an amazingly simple tool for the whole team. It's a methodology for developing software with a focus on collaboration, particularly with less technical people. BDD tests create a shared language across disciplines to describe how systems or applications work. The other main benefit of BDD is automated testing, i.e. the tests are executed by a machine, liberating testers from the tedious work of manually repeating a test script over and over again.
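For example, a scenario for a hypothetical loyalty feature (invented for illustration, not taken from a real project) might read like this in Gherkin, the plain-text format Cucumber executes. Designers, clients and testers can all read it, and the automation suite can run it:

```gherkin
Feature: Loyalty points

  Scenario: Points are awarded for a purchase
    Given a customer with 100 loyalty points
    When the customer completes a purchase worth £20
    Then the customer should have 120 loyalty points
```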

The adoption of ‘proper’ BDD at ustwo varies depending on teams and clients, but overall the results have been very positive.

Collaboration and Testing

In terms of collaboration, the reward has been preventing mistakes by sitting the team down together to discuss the work at hand before any designing or coding starts. Using concrete examples to describe behaviour removes ambiguity.

On the test automation side of things, the reward has been rock-solid applications as a result of a) catching bugs early in the development process and b) increasing the testability of our code.

To make the most of the collaboration sessions we ask designers and developers to avoid working ahead of time. We have seen time and again that once people have seen a solution (in the form of a visual interface or code), they struggle to do anything in a BDD session other than describe it.

Design and Code Together

Once you've seen it, if you then try to explain how it works, it's very hard to avoid describing what you've seen, a bit like in Inception. You think about form before function, and this limits the team's ability to explore other solutions. If you describe the behaviour first, however ("should display the user's tier, name, number of points, and extra links"), there's still plenty of freedom to come up with an interface that fits the requirements.
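The difference is easy to see in Gherkin. Both scenarios below are hypothetical; the first bakes one concrete interface into the test, while the second describes behaviour and leaves the design open:

```gherkin
# Imperative: tied to the current design
Scenario: View points
  When I tap the "Account" tab
  And I scroll down to the "Points" row
  Then the "Points" row should read "1200"

# Declarative: describes behaviour, leaves the interface free to change
Scenario: View points
  When I view my account
  Then I should see my number of points
```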

Stopping design and code from getting ahead of each other has definitely had an impact on the way we work, and some teams and clients have adapted to it more readily than others.

Another big hurdle is the relatively complex technical setup. We are still tweaking some of the specifics here and there, but with a few BDD'ed projects under our belt we have settled on the main concepts:

  1. The apps for each platform solve the same business problems, hence the interface should be reasonably similar.
  2. Despite 1), we must be able to accommodate platform-specific differences to stay close to platform interaction best practices.
  3. Cucumber as our BDD tool of choice.
  4. Declarative tests and page objects.
  5. A local mock server that enables us to test our applications under extreme conditions and edge cases.
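To make point 4 concrete: a page object wraps one screen of the application behind methods named after behaviour, so step definitions never touch UI details directly. Below is a minimal Ruby sketch; `AccountPage`, `FakeDriver` and the element identifiers are all hypothetical, and the fake driver merely stands in for a real automation backend such as Appium or Calabash:

```ruby
# A minimal page object sketch. AccountPage knows how the account screen
# is laid out; step definitions only talk to its methods, so a redesign
# of the screen means changing one class, not every test.
class AccountPage
  def initialize(driver)
    @driver = driver # the underlying, platform-specific automation driver
  end

  def points
    @driver.text_of('account_points').to_i
  end

  def tier
    @driver.text_of('account_tier')
  end
end

# A fake driver standing in for a real backend, purely so the sketch is
# runnable; a real driver would query the device's UI tree instead.
class FakeDriver
  def initialize(labels)
    @labels = labels # element identifier => displayed text
  end

  def text_of(identifier)
    @labels.fetch(identifier)
  end
end
```

A step definition behind "Then I should see my number of points" would then call `page.points` rather than digging into the UI tree, and platform-specific differences (point 2 above) can live inside per-platform page object implementations.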

While none of these concepts is difficult on its own, combined they make the testing code base considerable in size and complexity. This requires technically capable testers who are comfortable writing, growing and maintaining automation code. Also, the test automation tools for mobile devices are not as stable as their web counterparts, which sometimes decreases overall confidence in the testing suite.
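To make point 5 concrete as well: a local mock server can start out as little more than a canned-response HTTP stub. The Ruby sketch below is hypothetical and minimal (a real one would also want latency injection, error responses and request assertions), but it shows how edge cases such as a user with zero points can be staged deterministically:

```ruby
require 'socket'
require 'json'

# A minimal local mock server sketch: answers every request with a canned
# JSON body chosen by request path, so tests control exactly what the app
# receives. The routes and responses passed in are entirely up to the test.
class MockServer
  def initialize(port, routes)
    @port = port      # pass 0 to let the OS pick a free port
    @routes = routes  # path => Hash to serialise as the JSON response
  end

  def start
    @server = TCPServer.new('127.0.0.1', @port)
    @thread = Thread.new do
      loop do
        client = @server.accept
        path = client.readline.split[1]                    # "GET /user HTTP/1.1"
        nil while (line = client.gets) && line != "\r\n"   # drain request headers
        body = JSON.generate(@routes.fetch(path, 'error' => 'not stubbed'))
        client.write("HTTP/1.1 200 OK\r\n" \
                     "Content-Type: application/json\r\n" \
                     "Content-Length: #{body.bytesize}\r\n\r\n#{body}")
        client.close
      end
    end
  end

  def port
    @server.addr[1] # the actual port, useful when constructed with port 0
  end

  def stop
    @thread.kill
    @server.close
  end
end
```

Pointing the application under test at `http://127.0.0.1:<port>` then lets a scenario like "a user with zero points" run the same way every time, with no dependence on a live backend.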

BDD Technical Setup

To help with the technical setup, over the last few months we have been working on a sample reference project that demonstrates all of these concepts, and we would like to share it with the wider community for discussion and feedback. The code is on GitHub, along with documentation, particularly around our choices and trade-offs.

It is a project to use as a reference for what underpins the BDD process from a technical point of view, so we are calling on developers and testers to get involved. The points of view of people with and without previous BDD experience are both interesting, for different reasons.

So yeah, hop on to GitHub, email us, or get on the BDD channel in SlackLine.

Thank you!