Analytics in Washington

Published on: December 2, 2014

Tony Kornheiser made some interesting statements on the radio the other day regarding what the organization in Washington D.C. should do about their terrible football team. You can read a more in-depth piece about those comments here, but the main thrust was that Washington should start looking at analytics to improve their player selection process.

On the surface, I agree with this position. I am a firm believer in using useful mathematics to improve decision-making processes. I started this blog in an attempt to show people that we can develop individual player analytics in football and predict something about team performance from them.

However, in my humble opinion, Washington will not be the place where the football analytics revolution begins. Mostly I think that because of the actions of the team’s owner, Daniel Snyder. Before I say what I’m about to say, I want you to know that I have no intimate knowledge of Daniel Snyder. I’ve never even met the man. But I can observe his behavior, and his public-facing behavior leads me to believe something very important about how he may see the world.

It is my impression that Daniel Snyder loves certainty. His behavior seems to follow a profile of: X will do Y, Y will do Z, we want Z, so let’s go get X. He made a tremendous amount of money entirely on this principle. When his small company wanted to do something slightly different, he would go out and acquire another company that did that specific thing and incorporate it into the original company’s machinery. He’s made his entire living on being able to understand the needs of his organization and then trusting that the assets he spends a tremendous amount of money to acquire will become worth more than what he originally paid for them.

Don’t get me wrong, Snyder’s certainty has served him well in the contexts he was in when he made his billions. Certainty can be a good thing for business leaders, mostly because it allows them to remain leaders. Humans are really bad at differentiating confidence from competence. Projecting certainty creates an environment where people will follow you. So, in some contexts, having lots of certainty makes a lot of sense. The only problem is, football is not one of those contexts.

It’s like my colleague who had difficulty driving on ice. She was originally from California and went to college in Arizona, so she never had to learn what driving on icy roads is like. In addition, the person who taught her how to drive was a stock car driver. She was taught that to make the most efficient turns, you steer into the bottom of the turn, accelerate quickly through the bottom of the turn, and then steer out on the high side of the corner. And that works well for getting through turns quickly and efficiently. But living in California and Arizona meant she never had to confront the fundamental assumption that kind of driving rests on: traction. If there’s no traction, like when you’re driving on ice, you have to drive in a completely different way. Accelerating through the bottom of a turn is a really good way to spin your wheels and wipe out. Instead, you have to make sure that changes in direction never coincide with changes in speed. You can do one, but not the other. In my mind, this is what Snyder is doing. He is taking a method that has always worked in the past, applying it to a different context, not recognizing that the underlying reality is different, and wiping out on the ice that is the process of building an NFL team.

You can see this in the future draft capital he gave away to move up to the #2 pick in the 2012 draft to get RG3. Generally speaking, it is a really bad idea to give away future draft picks to move up. But such a strategy does make sense in a particular light: the light of certainty. If you feel like you absolutely know that this one particular player is going to work out, then it makes every bit of sense to act as Snyder did in 2012. Unfortunately, in football, having such certainty is disconnected from reality. The dirty little secret about player evaluation in football is that nobody knows who’s going to be good or bad. There are too many things to take into account, and the amount of error in prediction is larger than any human brain can keep track of. The best mathematical model I can create accounted for about 15% of what makes a good NFL quarterback at last check. The reality of the NFL says that the way to create a winning team is to stockpile draft picks, evaluate everyone as if they were all drafted in the same round, and repeatedly draft multiple players at the same position (i.e., at least try out a new quarterback every single year).
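
To put that 15% figure in context: it is a claim about variance explained, what statisticians call R squared. The sketch below is not my actual quarterback model; it is a toy with made-up numbers, tuned so that a single composite score explains roughly that share of a synthetic outcome, just to show how little 15% actually buys you.

```python
import random

random.seed(1)

# Toy illustration only: one composite "pre-draft score" per prospect, plus a
# large noise term standing in for everything scouts and models can't see.
n = 2000
scores = [random.gauss(0, 1) for _ in range(n)]
outcomes = [0.42 * s + random.gauss(0, 1) for s in scores]  # synthetic career value

# Fit a one-variable least-squares line and compute R^2: the share of the
# variance in outcomes that the score actually explains.
mean_s = sum(scores) / n
mean_o = sum(outcomes) / n
cov = sum((s - mean_s) * (o - mean_o) for s, o in zip(scores, outcomes))
var_s = sum((s - mean_s) ** 2 for s in scores)
slope = cov / var_s
intercept = mean_o - slope * mean_s

ss_res = sum((o - (intercept + slope * s)) ** 2 for s, o in zip(scores, outcomes))
ss_tot = sum((o - mean_o) ** 2 for o in outcomes)
print(f"R^2 = {1 - ss_res / ss_tot:.2f}")  # near 0.15 by construction
```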

The mindset you need to build a football team analytically is a mindset of uncertainty. You must accept the general premise that no one knows anything about anyone, that the best models will get you 15% of the way to where you need to be, and that you need to put yourself in a position to make luck work for you. I do not believe an individual like Snyder – a self-made billionaire used to projecting certainty from a leadership position – would value these qualities. There’s already the story of the economist Washington hired in 2006 to do analytic research, who quit after seven weeks of being marginalized in the organization.

The analytics revolution in football is coming quietly. The teams that end up doing analytics very well are not going to make a big splash about it. The Seahawks and Packers come to mind as teams that, I believe, are on the forefront of the football analytics movement but are not saying a public word about it. Washington is simply not the place where people will be quiet about a new idea. And, ultimately, talking up a new analytics department or bringing in some fresh-faced savior with a fancy mathematical model while demanding mechanistic links between actions and outcomes will result in utter failure of the analytics process. If Washington brought someone like me into the organization, I feel like Snyder would demand I hit the gas at the bottom of the turn. And while I might not be certain about much in football, I am certain about this: either I’d have to jump from the car or we’d both wipe out together.

Beholden to Talented Shitheads: Why We Need Analytics

Categories: General Info, NFL
Published on: September 9, 2014

I hope everyone is enjoying the new football season. I’m glad to see the Vikings are 1-0 and the defense looked good, although it was against the Rams, so I’m not sure that means all that much.

I don’t have much to talk about in the way of numbers today. We’ve got one week’s worth of NFL data, which will tell us largely nothing about how the rest of the season will play out, and we’ve got two weeks of college football data, which will tell us something so minor that we probably shouldn’t bother right now.

Instead, I thought I would talk about one of the more important social issues surrounding football right now. I want to talk about Ray Rice and, specifically, what Ray Rice shows us about the importance of adopting analytic strategies for selecting members of organizations.

Many people think that businesses use analytic strategies like skill testing and personality testing because the tests tell you which individual is the most talented, most productive, most useful potential employee, and the business then selects the person who comes out on top of the most important tests. And if you think that, you’d be sort of right about how the process works, but you’d also be sort of wrong.

Most businesses that use analytic strategies use their tests not to find a single individual, but instead to narrow the pool of possible individuals. Tests are used to cull the group, but they generally aren’t used to make a final decision. High scores are necessary to land the job, but they aren’t sufficient. Once the tests identify the proper pool of applicants comes the next, and most vital, question an interviewing team can ask: “Can we all work with this person?” Fit within the work culture and the ability to get along with co-workers are critical to building a functional organization. Any business using this strategy needs to be very careful that its answers to whether it can work with different people are not biased in ways that violate civil rights laws or any moral principles the company holds to, but in general that’s how companies use tests to select employees. Test them all, generate a pool, and then select not on high scores alone but on the more human elements.
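
As a rough sketch of that two-stage process (the candidates, scores, and cutoff below are invented for illustration, not taken from any real hiring system):

```python
# Hypothetical two-stage selection: tests cull the pool, humans make the final call.
candidates = [
    {"name": "A", "skill_test": 88, "fit_interview_passed": True},
    {"name": "B", "skill_test": 95, "fit_interview_passed": False},
    {"name": "C", "skill_test": 72, "fit_interview_passed": True},
    {"name": "D", "skill_test": 91, "fit_interview_passed": True},
]

SKILL_CUTOFF = 85  # arbitrary threshold: high scores are necessary...

# Stage 1: the test narrows the pool.
shortlist = [c for c in candidates if c["skill_test"] >= SKILL_CUTOFF]

# Stage 2: ...but not sufficient; "can we all work with this person?" decides.
hires = [c["name"] for c in shortlist if c["fit_interview_passed"]]
print(hires)  # ['A', 'D'] -- B scored highest but failed the human element
```

The test decides who gets into the room, not who gets the job.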

That’s the first way analytics helps you build your organization: you can be sure of selecting talented people who are actually the kind of people you want to work with. And that could be important if you’re trying to build a football team. Many coaches seem to have very high-minded policies about avoiding players with domestic violence histories. And while they seem to stick to those principles to greater or lesser degrees depending on the talent of the player in question, we can at least see how this would work. If your analytic strategy returns two players as equally likely to succeed and one of them has a history of domestic violence, you probably go with the other one. But that’s not why NFL teams need to quickly adopt analytics.

Using analytics to select employees is critical when one of your talented and valuable employees makes a mistake so horrendous, so unspeakable that it makes you rethink whether or not you would be able to work with that person ever again. Enter our connection to Ray Rice.

What Ray Rice did was unspeakable. But how the Ravens and the NFL responded to the situation is just as unspeakable. And while I can’t speculate on what was going through Rice’s head when he committed his act, I have been associated with enough employee selection meetings to have a guess at what the Ravens were thinking prior to cutting him.

The Ravens, and all NFL teams, are in an industry where talent is incredibly difficult to identify. Highly trained NFL scouts get evaluations of talent wrong every season. It’s a terrible job to try to be good at because almost no one truly knows what it takes to be a great football player. If the organization can’t reliably identify talent, it becomes very guarded about the talent that has fallen into its lap. And when organizations have limited confidence in their ability to find new talent, they are more willing to forgive egregious actions from the talent they actually have. In essence, organizations can become beholden to talented shitheads.

Selecting players using analytic strategies can break that cycle. When a talented member of the organization moves into territory that the rest of the organization can’t follow, it is a simple matter to separate from that person, regenerate a new pool of potential applicants, and begin the selection process all over again. We don’t have to run our rationalizer ragged trying to find reasons why Action X, while morally repugnant, doesn’t justify removing the person from the organization. Instead, the incentives for talented individuals to act like shitheads evaporate. The organization can afford to be less risk-averse when problems with talented players emerge. If the Ravens had a large-scale analytics-based selection process, they could have cut Rice in February and found two or three shiny new running backs. Instead, we have the nonsense we all saw this week. Honestly, I fail to see how the status quo is better.

The Analyst has No Clothes

Categories: General Info, Statistics
Published on: September 7, 2013

I follow a lot of scouts on Twitter.  Mostly because the nonsense they spout makes me angry and I use that anger as motivation to write.  Once in a great while, though, you find a scout that does things the right way.  Or at least the way you would do things if you had the wherewithal to actually want to do that job.  Matt Waldman is in this latter group.  When Matt talks about his process, he makes me believe he’s got something valid.  He says all the right things and avoids saying the wrong things about how he goes about his craft.  You can tell there is something important going on under the hood.  Also, he’s a hell of a writer.  I have spent days studying how he constructs such compelling sentences.

The point is I respect the dude’s work, which is why I was a little disappointed to read this article on his blog about the process of scouting wide receivers.  It’s not the wide receiver scouting part that bothers me.  It’s the part where he talks about why he is not a fan of “analytics.”

I believe analytics have value, but the grading of wide receivers based heavily on speed, vertical skill, and production is an ambitious, but misguided idea. Further the application is the torturing of data to fit it into a preconceived idea and making it sound objective and scientific due to the use of quantitative data.

That quote was incredibly depressing to read.  Mostly because the reader can so easily tell what the word “analytics” means to an intelligent, quality-focused scout.  The context around the word is dripping with disdain toward the self-serving, self-interested analyst.  It seems as though the people doing “analytics” that this author has met are more interested in notoriety and getting paid than in delivering an accurate answer.  He goes on to make this point.

I’m trying to do the same from a different vantage point. The more I watch wide receivers, the less I care about 40 times, vertical results, or broad jumps. Once a player meets the acceptable baselines for physical skills, the rest is about hands, technique, understanding defenses, consistency, and the capacity to improve.

I liked Kenbrell Thompkins, Marlon Brown, Austin Collie, (retired) Steve Smith, several other receivers lacking the headlining “analytical” formulas that use a variety of physical measurements and production to find “viable” prospects. What these players share is some evidence of “craft”. They weren’t perfect technicians at the college level or early in their NFL careers, but you could see evidence of a meticulous attention to detail that continued to get better.

Take a look at that second paragraph.  He talks about headlining analytical formulas in reference to physical measurements like 40 times, vertical jump, and broad jump results.  Here is the heart of the issue.  Several places doing respectable analysis (pdf here) have tested whether or not things like 40 times, vertical jumps, and broad jumps predict wide receiver production.  That sort of test is the exact thing that analytics can bring to the table.  Statistical analysis of 40 times, vertical jump, and broad jump results will tell you very clearly if the number is in any way meaningful.  And the answer that comes back repeatedly is the answer Matt has already arrived at: they’re not useful.  Anyone who thinks they can predict who will be a quality wide receiver based on a 40 time is wasting their breath and your time.  So are there people out there really running around building predictive formulas on 40 times?  If there are, those people should not be listened to.  Furthermore, the idea that such people exist makes me feel like a biker gang member who sees a non-member wearing his club’s rocker.
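
The kind of test being described is not exotic.  Here is a minimal sketch of it using synthetic, made-up numbers standing in for real combine and production data; the studies linked above do the real version with actual players:

```python
import random

random.seed(7)

# Synthetic stand-in data: 300 hypothetical receivers whose production is, by
# construction, unrelated to their 40 time. Real studies use real combine data.
forty_times = [random.gauss(4.5, 0.1) for _ in range(300)]
production = [random.gauss(500, 200) for _ in range(300)]  # e.g. receiving yards

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(forty_times, production)
print(f"correlation = {r:.2f}, variance explained = {r * r:.1%}")
# A correlation near zero means the 40 time tells you almost nothing about production.
```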

There is a right way and a wrong way to do statistical analysis.  Knowing the right way is not a trivial thing you can just dive into without training.  Somewhere, you need to learn the correct way to do it.  There are lessons to learn and dues to pay, and, to hear Matt talk about his experiences, there are people walking around pretending to have the cachet who simply don’t have a clue.

You can see this when you read the ESPN story about the Jacksonville Jaguars “analytics” department.  From my perspective, anyone with a brain should have been able to shred those conclusions and recognize how ridiculous they actually were.  Thankfully, someone at ESPN has both a brain and the ability to write and did it for us.  It should not have taken someone in the press to recognize how terrible that analysis was.  The basic premise of any good statistical analysis starts with the notion that the analyst is wrong.  It is then the analyst’s responsibility to work through every other possibility to find the holes.  And once you reach a point where you can’t see the holes in your own work, you give it to someone else to find holes you can’t see.
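
One concrete habit that falls out of “the analyst is wrong” is holding data back and checking whether your conclusion survives on data the model never saw.  Here is a minimal sketch of that habit, with placeholder noise standing in for real player data:

```python
import random

random.seed(3)

# Placeholder data: predictor and outcome are independent noise, so any pattern
# a model "finds" is spurious. In practice these would be historical player records.
data = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(60)]
train, test = data[:40], data[40:]  # hold out data the model never sees

def fit_line(rows):
    """Ordinary least-squares fit of y on x."""
    n = len(rows)
    mx = sum(x for x, _ in rows) / n
    my = sum(y for _, y in rows) / n
    cov = sum((x - mx) * (y - my) for x, y in rows)
    var = sum((x - mx) ** 2 for x, _ in rows)
    slope = cov / var
    return slope, my - slope * mx

def r_squared(rows, slope, intercept):
    my = sum(y for _, y in rows) / len(rows)
    ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in rows)
    ss_tot = sum((y - my) ** 2 for _, y in rows)
    return 1 - ss_res / ss_tot

slope, intercept = fit_line(train)
print(f"in-sample R^2: {r_squared(train, slope, intercept):.3f}")
print(f"held-out R^2:  {r_squared(test, slope, intercept):.3f}")
# If whatever the model "found" in the training data does not hold up on the
# held-out points, it was noise -- the kind of hole you want to find yourself.
```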

Given what I’ve seen when I hear NFL people discussing the advice “analytic” people have given them, it’s no wonder that analytics is having trouble gaining respect in NFL circles.  It seems there are a bunch of people talking to NFL decision makers whose analytic methods should be severely questioned.  If what the Jaguars and some other NFL teams are doing with numbers is considered “analytics,” I’m not sure I want to be associated with that term.
