**Tl;dr:** Graphs with the calculated champion similarities are found in the middle of this post. A player is likely to play champions that are close to each other in these graphs.

Hello everyone

I calculated champion similarities using neural net embeddings based on champion mastery points and wanted to share the results with you all. The method determines which champions tend to be played by the same person, so the output can be used as a recommendation engine. I did this because I work as a data scientist at a fairly large retailer and wanted to learn more about neural net embeddings, which could improve some neural nets I built in the past, and to deepen my knowledge of recommendation engines. We get some time every week for continued education at work, so this was an educational project I did in that time.

I wrote this post as I hope it will be an interesting read for you.

**Table of contents**

- Introduction
- Data collection
- Neural net embedding and loss function for champion similarity
- Graphical representation of the calculated champion embeddings
- Additional tables
- Thanks for reading

**Introduction**

The idea behind this project is to calculate champion similarity based on the champion mastery points of many player accounts. The reasoning behind using mastery points is that if a player likes some champions, they are likely to be similar in some way (e.g. I personally like to play enchanters and tanks), so the player will have high mastery points on these champions compared to the others. This is the same idea behind a recommendation engine: I want to calculate the similarity of champions across players, and could then recommend a player champions close to their main champion to try out.

To accomplish this, I used neural net embeddings.

Assume we have 5 champions and embed them into a 3-dimensional space. This will give us a matrix like the following one:

Champion | Dimension 1 | Dimension 2 | Dimension 3 |
---|---|---|---|
Amumu | 1 | 0 | 0 |
Ahri | 0.1 | 0.9 | 0 |
Syndra | 0 | 1 | 0 |
Yasuo | 0 | 0.1 | 0.9 |
Zed | 0 | 0 | 1 |

The dot product is calculated by multiplying two vectors element-wise and then summing up the results, *e.g.*

`dot(Amumu, Ahri) = 1*0.1 + 0*0.9 + 0*0 = 0.1 `

For the example matrix above, this would result in a similarity table like this (the table is symmetric, the diagonal is omitted):

| | Amumu | Ahri | Syndra | Yasuo | Zed |
|---|---|---|---|---|---|
| Amumu | – | 0.1 | 0 | 0 | 0 |
| Ahri | 0.1 | – | 0.9 | 0.09 | 0 |
| Syndra | 0 | 0.9 | – | 0.1 | 0 |
| Yasuo | 0 | 0.09 | 0.1 | – | 0.9 |
| Zed | 0 | 0 | 0 | 0.9 | – |

So for this example embedding, there would be a big similarity between Ahri/Syndra and Yasuo/Zed.
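For those who prefer code, the whole similarity table above comes from a single matrix product. Here is a minimal numpy sketch of the toy example (the champion names and values are the ones from the tables above):

```python
import numpy as np

# Toy embedding matrix from the example above (5 champions x 3 dimensions).
champions = ["Amumu", "Ahri", "Syndra", "Yasuo", "Zed"]
emb = np.array([
    [1.0, 0.0, 0.0],   # Amumu
    [0.1, 0.9, 0.0],   # Ahri
    [0.0, 1.0, 0.0],   # Syndra
    [0.0, 0.1, 0.9],   # Yasuo
    [0.0, 0.0, 1.0],   # Zed
])

# All pairwise dot products at once: (5 x 3) @ (3 x 5) -> (5 x 5) similarity matrix.
similarity = emb @ emb.T

print(round(similarity[0, 1], 2))  # dot(Amumu, Ahri) -> 0.1
print(round(similarity[1, 2], 2))  # dot(Ahri, Syndra) -> 0.9
```

Reading off the matrix gives exactly the similarity table, which is why the embedding and the dot product together are all we need.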

After learning the embeddings, they are reduced to two dimensions using a dimensionality reduction technique. I will use t-SNE, and also provide the result from MDS, as you will see that the choice of dimensionality reduction has a visible effect on the output.
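As a sketch, the reduction step could look like this with scikit-learn (the random matrix stands in for the trained 150 x 15 embeddings, and the t-SNE hyperparameters are illustrative assumptions, not necessarily the ones I used):

```python
import numpy as np
from sklearn.manifold import MDS, TSNE

rng = np.random.default_rng(0)
embeddings = rng.normal(size=(150, 15))  # placeholder for the trained 150 x 15 embeddings

# t-SNE: preserves local neighbourhoods, tends to form tight clusters.
coords_tsne = TSNE(
    n_components=2, perplexity=30, init="random", random_state=0
).fit_transform(embeddings)

# MDS: tries to preserve all pairwise distances, spreads points out more evenly.
coords_mds = MDS(n_components=2, random_state=0).fit_transform(embeddings)

print(coords_tsne.shape, coords_mds.shape)  # (150, 2) (150, 2)
```

The different objectives of the two methods are exactly why the plots further down look so different despite coming from the same embeddings.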

The code for this can be found

**Data collection**

A lot of data is usually necessary to reliably train neural networks. To get it, I accessed the Riot API to download the mastery points for 120'000 accounts on both EUW and NA (I might also do Korea for comparison in the future). To collect account names, I started with my own account (somewhere in gold, I think) and took the account names from my last 100 played games. I repeated this for ~500 randomly selected accounts, which gave me a pool of ~300'000 accounts, from which I randomly selected the 120'000 accounts.

For all of these accounts, I downloaded and saved the mastery points for all but the newest champions (the newest champion considered is Yone), for a total of 150 champions, into a local database.

**Neural net embedding and loss function for champion similarity**

After downloading the data, I excluded one-trick accounts (remember that I want to recommend you another champion, not to reinforce a one-trick), which I defined as accounts where a single champion holds more than 50% of all the champion mastery points. I also excluded accounts with fewer than 100'000 champion mastery points over all champions.
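The filtering step can be expressed in a few lines. This is a sketch under my own naming (`keep_account` and the dict-based account format are assumptions for illustration; the thresholds are the ones described above):

```python
def keep_account(mastery, min_total=100_000, max_share=0.5):
    """Return True if the account should enter training.

    Excludes one-tricks (one champion holding more than `max_share` of the
    account's mastery points) and low-activity accounts (total mastery
    points below `min_total`).
    """
    total = sum(mastery.values())
    if total < min_total:
        return False
    return max(mastery.values()) / total <= max_share

# Example: a one-trick account is dropped, a diverse one is kept.
print(keep_account({"Yasuo": 300_000, "Zed": 50_000}))                # False (one-trick)
print(keep_account({"Ahri": 80_000, "Lux": 60_000, "Zyra": 70_000}))  # True
```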

For anyone interested, the neural net including the dot product is defined as

```python
X = mx.sym.Variable('data')
y = mx.sym.Variable('label')

# Embedding layer: maps each champion id to an nDimEmbedding-dimensional vector
symEmb = mx.sym.Embedding(data = X, input_dim = nChamps, output_dim = nDimEmbedding)

# Split the two champions of a pair and reshape to (batch, nDimEmbedding)
symEmbChamp1 = mx.sym.slice_axis(symEmb, 1, 0, 1)
symEmbChamp2 = mx.sym.slice_axis(symEmb, 1, 1, 2)
symEmbReshape1 = mx.sym.reshape(symEmbChamp1, (-1, nDimEmbedding))
symEmbReshape2 = mx.sym.reshape(symEmbChamp2, (-1, nDimEmbedding))

# Dot product of the two embedding vectors ("SkalarProd" = dot product)
symSkalarProdWinkel = mx.sym.sum(symEmbReshape1 * symEmbReshape2, axis = 1, keepdims = True)

# Regression loss against the target ("Fehler" = error)
symFehler = mx.sym.LinearRegressionOutput(symSkalarProdWinkel, y)
```

together with a custom data iterator. The embedding layer has 15 dimensions. The data iterator randomly selects 15 champions for a random account (skewed towards the more played champions) and calculates the geometric mean of the champion mastery point ratios for every champion pair, which serves as the target variable for the neural net:

`geom(Champ_1, Champ_2) = sqrt((Mastery_1 / sum(All mastery points)) * (Mastery_2 / sum(All mastery points))) `

The loss between the dot products and the target variables is a standard MSE.
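In code, the target calculation from the formula above could look like this (`geom_target` is a hypothetical helper name, not from my actual code):

```python
import math

def geom_target(mastery_1, mastery_2, total_mastery):
    """Geometric mean of the two champions' mastery-point ratios,
    used as the regression target for the dot product."""
    return math.sqrt((mastery_1 / total_mastery) * (mastery_2 / total_mastery))

# Example: champions with 90'000 and 40'000 points on a 1'000'000-point account.
print(round(geom_target(90_000, 40_000, 1_000_000), 6))  # 0.06
```

Note that the target is large only when *both* champions have a sizeable share of the account's mastery points, which is exactly the "played by the same person" signal we want the dot product to learn.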

Each epoch of the neural net training consists of the pair combinations of the 15 randomly selected champions for 10'000 randomly selected accounts; this is repeated over 100 epochs.
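Putting the iterator logic together, one epoch's training pairs could be generated roughly like this. This is a simplified sketch, not my actual iterator: the weighted-sampling scheme and the `epoch_pairs` helper are illustrative assumptions:

```python
import itertools
import math
import random

def epoch_pairs(accounts, n_champs_per_account=15):
    """Yield (champ_1, champ_2, target) triples for one epoch.

    `accounts` is a list of dicts mapping champion name -> mastery points.
    Champions are sampled without replacement, weighted by mastery points,
    which skews the draw towards each account's more played champions.
    """
    for mastery in accounts:
        pool = list(mastery)
        weights = [mastery[c] for c in pool]
        picked = []
        for _ in range(min(n_champs_per_account, len(pool))):
            c = random.choices(pool, weights=weights, k=1)[0]
            i = pool.index(c)
            pool.pop(i)
            weights.pop(i)
            picked.append(c)
        total = sum(mastery.values())
        # Every pair of picked champions becomes one training example.
        for c1, c2 in itertools.combinations(picked, 2):
            target = math.sqrt((mastery[c1] / total) * (mastery[c2] / total))
            yield c1, c2, target

# Example: one account, 3 sampled champions -> C(3, 2) = 3 pairs.
acct = {"Ahri": 50_000, "Lux": 30_000, "Zyra": 20_000, "Amumu": 10_000}
pairs = list(epoch_pairs([acct], n_champs_per_account=3))
print(len(pairs))  # 3
```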

**Graphical representation of the calculated champion embeddings (not mobile friendly, but what can you do with 150 champion icons, sorry)**

The calculated embedding after neural net training is a 150 by 15 matrix, which is nearly impossible to visualize directly. To overcome this, we need to reduce the dimensionality to a displayable amount (namely 2 dimensions), for which I used t-SNE. The results look as follows (if you miss your champion, it can happen that two champions are so close together that one icon is completely covered by the other, *e.g.* Ornn is behind Urgot for EUW):

EUW:

We can nicely see the ADC cluster (with Ziggs) at the bottom and the supports to the bottom left, separated into enchanters, tanks and catchers, with Zyra (my old main) hanging in there somewhere. Lux is also more support than midlane mage. There is also an assassin/edgelord cluster top left. Interestingly, LeBlanc is located with the other mages, not the other assassins. In the center and top right we have tanks and junglers, with fighters/juggernauts on the right, except Irelia, which sits in the edgelord cluster.

We can also see champions like Lillia, Nidalee, Qiyana, Quinn, Ivern, Aurelion Sol and Yorick sitting far from the other champions. This is to be expected, as they have unique playstyles or attract one-trick players, so there are no similarly played champions near them.

NA:

While NA looks fairly similar to EUW, there are some differences where the clusters are located relative to each other. We can discuss individual champions or clusters further in the comments.

As a comparison of the effect the choice of dimensionality reduction technique has, I also want to show the results from applying MDS on the trained neural net embeddings.

EUW1:

Compared to t-SNE, the champions are more evenly spread out. Overall, the same clusters seen with t-SNE still exist, but are much less visible. On the other hand, you can see the champion icons better here, as they overlap less.

NA:

**Additional tables**

As I downloaded all the champion mastery points anyway, I also want to show some more information on them which I found interesting.

Here is a table of the ratio of accounts which had no mastery points for a given champion (out of the 120'000 accounts):

Not surprisingly, a lot of players have not played the newer champions, but Ivern and Skarner are also up there. On the other end, almost everyone has played at least a single game of Ashe or Lux. In addition, it seems that EUW players tend to try out a slightly wider variety of champions than NA players.

Here is a table of the total sum of all mastery points for the different champions (over the 120'000 accounts), as well as the ratio of these mastery points compared to the total sum of all mastery points:

I don't think we have to discuss which champions ~~will get the next skins~~ are the most popular.

**Thanks for reading**

Thanks for reading this far. It was a really interesting small project for me, and I hope you found something in here that got you thinking. Again, I did this as a personal continued-education project and have no affiliation with Riot. If you have questions, I'll try to answer them in the comments. The Tl;dr is at the beginning.

Have a good day 🙂
