Feedback on how well a design vision translates into the player's experience is vital for any designer, and for any process of consistently making better games. Although there are elements of games where the player is looking for an efficient interaction, such as when navigating menus, efficiency is not the main goal of games.
Instead, challenge is an important part of games, and a key part of the fun. The difficulty for game designers lies in crafting the right kind of challenge, and it is up to user researchers to help define it. As well as uncovering usability and balancing issues, researchers therefore have to try to discover whether players are having fun.
Game designers are often maligned, and certainly so in Rainbow Six: Siege when the sum of many moving parts falters and presents itself as a bug or a balancing issue, especially in competitive play. However, criticism needs to be measured and constructive to allow game designers to carry out their duties effectively. What's more, it is important to remember that every game designer is, at the end of the day, human.
SiegeGG had a chat with one of these humans, Julien Huguenin, the User Research Project Manager on Rainbow Six: Siege, to talk about his work and the Pro League Finals in Milan:
Firstly, can you explain to us what exactly your job entails?
I’m the User Research Project Manager on Rainbow Six: Siege, which means that my role is to drive the research on our players and pilot the efforts of a team of user researchers with backgrounds in cognitive science and game design. My expertise is more on the qualitative feedback side of things, and we focus on bringing players (pros or not!) into our development studios to test upcoming content. As a whole, we are part of a larger team of data scientists and community experts called the Game Intelligence team. Together, we analyze and cross-reference the data and feedback generated by our players, and track what’s happening on social media. To that, we add what we can observe in our User Research Lab test rooms, as well as our own internal impressions of new and existing features in R6. We all work together with the game designers to make sure the development team can make the most informed decisions for the game.
What social platforms do you use to track what the community thinks?
Reddit, Twitter, Discord, Ubisoft forums, Steam… However, that’s only a small fraction of the information we get from players. We also track a lot of what’s going on in live games, we bring players of all levels into our studios to test content, we send out surveys, and so on. Of course, what the community expresses on social media is important, but it’s not the whole picture for us. The behaviours we can observe in-game through tracked data are also critical to our understanding of R6, especially because a significant percentage of our players don’t engage much on social media about Rainbow Six.
How does the data collected influence competitive and non-competitive Rainbow Six?
The data and feedback we collect are used to improve and balance the game at all levels of play, which means that what we observe, measure, and hear from all of our players drives our development efforts. Pro League and high-level players’ data is used to balance the game at the highest level. In addition, frustrations expressed by the community (through surveys and social media) are used to inform balancing and to add or change features that affect our entire community. Good examples of that would be the development of the Pick & Ban feature, or the changes to the Casual playlist.
Do you prioritize certain types of feedback? If so, how does that influence the conclusions of the User Research team?
Each type of feedback and data has a different purpose, with its own value and limitations. Our objective is always to make the game enjoyable for the entirety of our playerbase, and to have a high level of play that is interesting to compete in and to watch. To achieve that, we try to cross-reference all of our insight sources to get a global picture of what’s going on at any given time. For example, when we balance an operator, we look (among other things) at the pick rate and win rate at low levels, at high levels, and in the Pro League. We also send surveys to broad samples of the community, look at discussions on Reddit and Twitter, and hold recurring workshops with some of the best players and minds in the world.

That being said, data from people who play at a higher level tends to paint a clearer picture that is easier for us to work with, since their behaviour is much more consistent from one match to the next, and so is the data generated from their games. On the other hand, if we work on a feature targeted at players with less experience (like the Newcomer playlist), we will focus on the behaviours and data of those less experienced players. Moreover, higher levels of play tend to react much faster to patches and balancing changes, so when it comes to game balance, that is often the first place we’ll look to get an idea of how the game is evolving. Conversely, when we want to push for larger changes that will affect most users equally (like the changes to Casual rules), we investigate the feedback and data of the population at large. Everything is leveraged towards specific goals, and we spend a lot of time thinking about which insight source we should look at to address any specific topic.

And lastly, contrary to what a lot of people think, there is actually a lot of convergence between different data sources from different types of players. The most frustrating operators, or the least-liked maps, for example, tend to be similar from Bronze to Diamond, with some exceptions, in different proportions, and sometimes for different reasons.
The Newcomer playlist is a way for newer players to play against similarly skilled opponents. What effect do you think this has had on the newer additions to the Siege community?
The Newcomer playlist offers a space where new players can learn to play with a limited number of maps and game modes. Based on our studies, trying to master too many maps and sites at once slows down the learning process. The Newcomer playlist is a perfect place to start small, while training on maps and game modes that are competitively viable enough that anything you learn there will remain useful on other maps in the future.
When different skill levels/communities show different trends in the data (for example, Ying was better in Pro League than she was in standard play), what steps do you take in response to that?
To be honest, there is no magic solution when this happens. What we do is try to define and isolate what makes operators, gadgets, or weapons too strong at either high or low levels of play, and address that specifically without breaking the fundamentals of the game and reworking the whole operator. In the case of Ying, she worked very well with other operators to deliver a strong, coordinated, and well-timed push. Those pushes are less likely to happen in Ranked or Casual, so she is mostly fine there. That’s why we focused our changes on making it harder to get the most out of her kit in just a few seconds, without making her fundamentally different. That’s our approach in most cases: break down what makes an operator problematic in a specific context and focus on that.
Who do you think is going to win in Milan?
The overall level of the eight qualified teams is super high across all regions, and with G2 Esports out of the picture, I feel like any team has a shot. That being said, Team Empire is still very strong and consistent, so I would bet on them!
---
The Milan finals take place on the 18th and 19th of May, where eight teams will fight for the title of Season 9 Pro League champion. To keep up to date with the Milan coverage before, during, and after the event, be sure to check back here at SiegeGG.