
Updating and simplifying consumer rights

Categories: Case study, Engagement, Strategy

In July 2012 BIS launched a consultation on modernising and simplifying consumer law on the supply of goods, services and digital content. This is a policy area that affects everyone in the UK, so it was a huge task.

In fact that was the problem - it was, in a very literal sense, a very big consultation, running to 101 detailed policy questions spread over a massive 221 pages.

We knew that the core stakeholders – those with a specific interest such as consumer rights groups and large retailers – would respond to the consultation, but we wanted to go wider. What about individual consumers? What about small retail businesses?

The task

Our big idea was to run a parallel online consultation, focusing only on the really key questions so that it was short, sharp and, above all, accessible to a much wider audience. In the end we split the consultation into 18 core questions across 3 separate surveys (one each on our goods, services and digital content proposals), housed on a dedicated microsite built by our colleagues in the digital team on their WordPress platform. We also produced a ministerial video with the aim of simplifying and boiling the questions down still further.

Goals

Our simple goal in producing the microsite was to increase the number and diversity of respondents to the consultation. Our hope was that this would add valuable additional evidence to aid in the development of our final policies.

How we went about it

The key step was identifying questions that were both important and understandable to the average reader, without burying them under a mountain of explanatory text.

Once the questions had been settled on, a further important consideration was how best to phrase them. We wanted to make it easy and quick for people to respond, but we obviously didn’t want to be too leading in our questions. In the end we opted for a series of questions with drop-down answers, but with optional follow-up boxes for additional text. The former allowed us to quickly analyse the data to identify general themes, while the latter allowed us to take on board more nuanced views.

Compared to that, the design of the microsite seemed relatively simple from the policy team’s perspective. The three surveys each had their own separate page with links to further explanation (in the form of pop-out windows) and to the full consultation document, so that if the respondent wanted to delve a bit deeper into the rationale, they could easily do so. There was also the option for the public to add comments to each page, separate to the survey questions themselves, and this generated some interesting points that the team were able to respond to directly.

The final design featured an introductory blog from the minister.

Behind the scenes, the survey results were tabulated and could be easily downloaded by the policy team for analysis at any time (allowing us to gain early insight into trends).

In promoting the microsite, we arranged for a blog from the minister to appear on the Which? website, and asked our stakeholders to put links on their websites and in their newsletters, etc. We also tried using Twitter to improve uptake, but this didn’t take off quite as we’d hoped - our hashtag and variations of it managed to gather little if any traction.

What worked well

Increased reach. This consultation aimed to reach beyond the usual stakeholders who reply to consultations. Response numbers were higher than for previous consultations and reach was greatly increased – 66% of individual respondents replied through the shorter microsite version of the consultation rather than the traditional route.

The number of individual respondents and the type of responses reassured us that the views of traditional stakeholders were broadly similar to those of consumers.

We devolved handling of the online responses to a strong and knowledgeable member of the team, and the quality of her replies to questions helped increase the depth of the evidence.

The microsite was a great example to showcase to policy colleagues how digital engagement can reach more people and improve the quality of responses. The Minister for Government Policy (Oliver Letwin) highlighted the microsite as an example of best practice.

What worked less well

Although response rates were higher than for previous consultations, given the nature of the consultation the team had expected many more people to respond. Only 220 people actually responded, with conversion rates of 5.2% for goods, 3.1% for services and 2% for digital content, so we were slightly disappointed that more individual consumers didn’t take part.

More promotional activity could have been carried out to help drive traffic to the site.

What would the team do differently next time?

  • allocate enough resource to help promote the consultation
  • build relationships in established forums and groups to help generate responses
  • manage internal resource expectations – the daily monitoring of and replying to responses, although beneficial, did take time

Impact on staff

Developing the microsite version focused us on the key questions we wanted consumers to respond to. It made us ask what the ‘killer questions’ were that we wanted to focus on, which has had the ongoing benefit of sharpening other briefing materials, making them much ‘punchier’.

We now have first-hand experience of delivering a consultation in this format and can communicate the benefits and limitations of this approach.

We will shortly be entering into a new period of consultation prior to the Bill being introduced, and are developing a digital engagement plan to support this.

Impact on policy

Monitoring responses daily highlighted issues much more quickly and gave us time to investigate and adapt our thinking from an early stage – this was extremely beneficial when working to tight deadlines immediately after the consultation closed.

Digital Team Takeaway

  • We were really impressed by the way this team handled the daily monitoring of replies. They allocated a senior, knowledgeable member of the team to keep track and answer questions, and as a result got deeper, better-quality responses as well as spotting trends earlier.
  • This was a really interesting consultation and we missed a trick by not helping the team to carry out sustained promotional activity to increase the number of responses.
