Salta Group

Salta Group has a clear mission: to make lifelong development possible for everyone. Together with its 26 educational organisations, it forms the largest private training organisation in the Benelux and annually equips more than 1 million people with knowledge and skills. Salta Group wants to keep developing itself, as its mission conveys. Alex van der Weide (Director of Communication) and Annemarie Jans (Manager Expertise Center Operations) talk about the intention and impact of researching customer needs, and about their experiences with the CircleLytics dialogue for qualitative deepening and co-creation, moving beyond focus groups and surveys.

Jans: “We are interested in the experiences of each student, from registration to the moment a course or training is completed. After every course and training, we evaluate standard topics via online surveys in the online learning environment to see whether and how we can improve. Quantitative studies like this give us information, but do not show the underlying reasons behind the answers. That makes it difficult to make precise decisions that really matter to them and to our company. We used to hold annual focus group sessions with students from various study programs to collect qualitative feedback about the entire journey a student takes with us. However, these focus groups are time- and labour-intensive and limited in size; besides often being very expensive and not in-the-moment, they also lack reliability. We were looking for an additional research method that is less time- and labour-intensive and allows for a larger sample size, and therefore greater representativeness. We got in touch with the CircleLytics platform, immediately saw the potential of the dialogue for Salta Group, and got to work.”

The perfect start

Under the name ‘The perfect start’, the trainers of Salta Group have the ambition to offer students the best possible start to their education. The first dialogue was initiated to find out what a selection of 700 students from different long-term programs with the same starting date consider a perfect start to a program. Jans: “We received a demonstration from CircleLytics of how the platform works and then got to work on the questions ourselves. Together with our team, we determined the right tone of voice, conducted an internal pilot and asked for feedback. We tested the final questions with the CircleLytics team to arrive at the best possible wording, and that version was sent to the students.

We chose two closed and two open questions. The closed questions were: ‘At what level do you study?’ and ‘What did you think of our perfect start and can you explain that?’ Participants could answer the latter question with a score from 0 to 10 and substantiate it with the why behind it.

The open-ended questions delved deeper into how perfect they thought the start was: ‘Which things do you think are needed for a perfect start?’ and ‘What suggestions do you have for us to improve the start?’. We were especially curious about the qualitative reactions, of course, and they came!”

Operation of the dialogue

A dialogue is anonymous, online and consists of two consecutive rounds. In the first round, participants are presented with a number of questions to answer: multiple-choice questions, closed questions, rating questions and open questions. You can also combine closed and open questions, so that you deliberately design a well-considered open-ended question instead of just adding a text field that says ‘comment here’. The number of questions should be determined carefully, so as not to overload participants and to keep the focus on the topic. How a question is asked matters for receiving the right responses; in other words: what answers are you looking for? Can the question be interpreted in only one way, and is it formulated objectively? Does the question provide enough direction and structure?

In the second round, the answers that participants gave in the first round are presented back to the group. Participants can now rate, explain or supplement the anonymous answers of the other participants. The smart thing about the algorithm is that each participant is shown a varied set of answers from others, so that collectively as many different answers as possible are seen. This leads to new insights among participants and allows them to think along again. Qualitative answers are rated in this way and collect sentiment scores between -3 and +3, expressing to what extent other participants support or reject an answer, and why. Learning what they reject is also incredibly valuable: after all, you don’t want to make choices that you know people don’t want.

Participants have a couple of days for each dialogue round, so you don’t have the rush of a focus group, but you do have the proverbial night’s sleep that is needed for reflection. This reflection on our questions and on each other’s answers ensures deepening and validation. Participants are also allowed to change their closed scale answers if they want, and on average 60% of participants indeed submit different final scores. Unlike surveys, this gives you high reliability.
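To make the round-two mechanics above more concrete, here is a minimal sketch in Python of one possible way answers could be spread across raters and how sentiment scores on the -3 to +3 scale could be aggregated per answer. This is an illustrative assumption, not CircleLytics’ actual algorithm; the function names and example answers are made up.

```python
import random
from collections import defaultdict
from statistics import mean

# Hypothetical round-one answers (in practice these come from participants).
answers = [
    "A welcome email with a clear checklist before day one",
    "An introduction session with the trainer and fellow students",
    "Access to the online learning environment a week in advance",
    "A clear overview of required books and materials",
]

def assign_answers(answers, participants, per_participant=3, seed=42):
    """Give each round-two participant a varied subset of answers to rate."""
    rng = random.Random(seed)
    return {
        p: rng.sample(answers, k=min(per_participant, len(answers)))
        for p in participants
    }

def aggregate_scores(ratings):
    """ratings: list of (answer, score) pairs, score in -3..+3.
    Returns mean sentiment and support/reject counts per answer."""
    by_answer = defaultdict(list)
    for answer, score in ratings:
        by_answer[answer].append(score)
    return {
        a: {
            "mean_sentiment": round(mean(scores), 2),
            "support": sum(s > 0 for s in scores),
            "reject": sum(s < 0 for s in scores),
            "ratings": len(scores),
        }
        for a, scores in by_answer.items()
    }

# Spread answers over three hypothetical round-two participants.
participants = ["participant-1", "participant-2", "participant-3"]
print(assign_answers(answers, participants, per_participant=2))

# Example round-two input: each tuple is one participant's rating of one answer.
ratings = [
    (answers[0], 3), (answers[0], 2), (answers[1], 1),
    (answers[2], -2), (answers[2], 3), (answers[3], 0),
]
for answer, stats in aggregate_scores(ratings).items():
    print(stats, "-", answer)
```

In this sketch, an answer with many positive scores rises as a supported idea, while negative scores flag what participants reject, which mirrors the support-versus-rejection reading described above.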

High response

Van der Weide: “In our opinion, the response to our first dialogue among the 700 students was high at 30%. We notice that respondents in round two are triggered to respond to ideas of others. That is the power of this system. People are curious about the answers of others and show what is important to them. Participants rated the dialogue at 4+ out of 5 points: also a high score. Of course it has to be fun for them!”

Jans adds: “For us, the big advantage is that everything is weighed in the second round. This makes it a qualitatively weighted study, making it easier and more reliable to create a report with the focal points. We did not see any extremes, but we learned lessons from various themes. We are currently working on implementing those points for improvement in our workflow, so that we have fine-tuned ‘The perfect start’ for the next starting moment. We now have a better understanding of the needs of the students who follow a multi-year course.”

The first day of class

A concrete example that came to the fore was the experience of the first day of a course. Jans: “Despite our communication beforehand, not every student was prepared down to the last detail. We noticed that the desired flow of information can differ per education level. We have identified concrete points for improvement from the dialogue, which we are now applying at the next starting moment for students.”

Van der Weide: “We notice that working professionals, who obtained a degree some time ago, sometimes find it difficult to articulate exactly what they expect from a trainer. Quantitative research then paints too limited a picture. Qualitative research is more refined, because in a focus group you can keep asking questions until you get to the real core. You then use sentences such as “Other respondents said …”, not to steer them, but to give them ideas. However, focus groups are limited in numbers, and even then you do not have a complete picture.

“The CircleLytics dialogue gives the best of both worlds: the questions and outcomes correspond to qualitative research, and the dialogue can be carried out on a large scale. The fact that everyone can participate at their own time and from their own place is also an important advantage. Objectively and without putting words in people’s mouths, the CircleLytics dialogue lets you find out, faster and with an unlimited number of people, exactly what people mean, especially on closer inspection after that second round. Improvements often lie in nuances; by knowing them, and knowing which points are most and least important, you can meet the needs of students. That combination makes the dialogue so interesting to use.”

Would you like to know more about how the dialogue works and how it can strengthen your change processes, research, and the making, implementing and monitoring of plans? Contact us here for an introduction or demonstration, online or at the office.
