Luke Skywalker’s instincts and R2-D2’s artificial intelligence make them unbeatable. The next unstoppable human-AI duo may just sit on basketball team benches, says a Carolina computer scientist.
Gedas Bertasius, an assistant professor in the College of Arts & Sciences’ computer science department, knows AI. As a former Lithuanian junior national team member and Dartmouth College player, he knows basketball.
AI, Bertasius says, combines human efforts — such as writing code that tells a computer how to carry out a task — with the computer’s ability to learn and process analytics. In his research, he is trying to decrease the human part and increase a computer’s ability to think fluidly during video content analysis. He writes code that gives the computer general directions for viewing real-time or recorded video. The computer could learn to recognize flames, someone breaking into a building, an error in manufacturing or an opposing point guard’s tendency to dribble or shoot off a screen.
Basketball AI requires humans to watch hours of video while starting and stopping as they click or “tag” players’ elbows and legs to tell a computer what to analyze, Bertasius says. “It’s very costly, but a benefit may be that you can use the annotations to train your model. There are different ways to label. Many people focus on a person’s pose to understand how they move and do things. That’s even more costly because labeling each person’s joints, for example, might require 10 to 20 clicks for each person.”
Bertasius espouses a less tedious method. Instead of 10 to 20 clicks per person, he applies a single “label,” expressed in code, to each video so that a computer recognizes the action. Take, for instance, a common basketball play called a pick and roll.
A “pick” is a legal screen: an offensive player stands stationary in a defender’s path to free a teammate to shoot or pass, sometimes to the screener, who has pivoted or “rolled” to get open. Instead of low-level instructions such as labeling joints, his lab would place single labels on 100 videos. Their hope is that the “this is a pick and roll” labels jump-start the computer’s ability to identify and analyze the play in other videos.
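To make the cost gap between the two labeling styles concrete, here is a back-of-the-envelope sketch in Python. Every number in it — frames per clip, players on screen, clicks per person — is an assumption for illustration, not a figure from Bertasius’ lab.

```python
# Hypothetical illustration of the annotation-cost gap Bertasius describes:
# pose labeling needs many clicks per person per frame, while clip-level
# labeling needs one label per video. All numbers are invented assumptions.

def pose_label_clicks(frames, people_per_frame, clicks_per_person=15):
    """Clicks to annotate every joint of every person in every frame."""
    return frames * people_per_frame * clicks_per_person

def clip_label_clicks(videos):
    """Clicks to attach one action label (e.g. 'pick and roll') per video."""
    return videos

# Labeling 100 short clips of ~60 frames with 4 visible players each:
pose_cost = pose_label_clicks(frames=100 * 60, people_per_frame=4)  # 360,000
clip_cost = clip_label_clicks(videos=100)                           # 100
print(pose_cost, clip_cost)
```

Even under these rough assumptions, the clip-level approach is thousands of times cheaper to annotate — which is the trade Bertasius describes: more examples needed, far fewer low-level details specified.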
“The learning process is not super-efficient because you need lots of data examples,” Bertasius says. “But at the same time, it’s better because you don’t have to specify lots of low-level details.”
Bertasius and his graduate students have built models that enable servers with 30 to 40 graphics processing units to recognize activities such as running and swimming in a dataset of 300,000 YouTube videos. They didn’t tell the machine that moving legs signify running. Instead, the model allowed the computer to learn and adapt.
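The idea that a model is never told the rule — only shown labeled examples until the pattern emerges — can be sketched with a toy learner. This is a deliberately tiny stand-in, not the lab’s actual model: the single “leg motion” feature and the data values are invented for illustration.

```python
# A toy stand-in for learning from examples: the model is never told
# "moving legs mean running." It only sees (feature, label) pairs and
# nudges its weights until the pattern emerges. Data is invented.

def train(examples, lr=0.1, epochs=200):
    """Perceptron-style learner on a single 'leg motion' feature."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in examples:            # y: 1 = running, 0 = swimming
            pred = 1 if w * x + b > 0 else 0
            w += lr * (y - pred) * x     # nudge weight toward the label
            b += lr * (y - pred)
    return w, b

# Toy data: large leg-motion values tend to mean running.
data = [(0.9, 1), (0.8, 1), (1.1, 1), (0.1, 0), (0.2, 0), (0.05, 0)]
w, b = train(data)
classify = lambda x: 1 if w * x + b > 0 else 0
print(classify(1.0), classify(0.1))
```

The learner ends up separating high motion from low motion on its own; nowhere in the code is the rule “moving legs signify running” written down.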
The AI operates much like the human brain, processing millions of parameters but without a way to see its thinking processes. “You can’t dig into it and understand what’s going on. A lot is based on intuition,” Bertasius says.
Case in point: Bertasius says that without coding instructions for specifics of an action, the model enables AI to learn that “on a three-point shot, the person has to be behind this line and the ball has a specific trajectory. It’s learning a pattern and correlates that pattern.”
How the Tar Heels use AI
At Carolina, the women’s basketball team is making long-term plans for analytics use, including using AI for increased efficiency — things like assessing each player’s development and evaluating high school recruits. But during the season, most of the staff’s time is spent preparing for opposing teams.
Sam Miller, the team’s director of scouting and video operations, is a 30-year basketball veteran as a player and coaching staff member. He says that an NBA-led boom in analytics began in 2013. Under Head Coach Courtney Banghart, he says, the entire staff is open to what analytics can do for the team.
The Tar Heels gather data on opponents from several sources. One is Synergy, a company that uses a proprietary method to analyze video through recognition of player movements and team patterns. Every NBA team, hundreds of college teams, and teams in Europe, high schools and the Amateur Athletic Union use Synergy.
Synergy’s analysis happens as games are televised. Reports are often ready minutes after a game ends.
From Synergy’s dashboard, Miller can choose an upcoming opponent from a dropdown menu to see all kinds of reports on that team and its individual players. “It’s all the algorithm,” Miller says. “Their people are tagging it. They’re teaching a computer to follow tags.”
He clicks on a player’s name. “You can literally see what they do without ever watching them.” To assess the player’s tendencies, he looks at play types, clicking on the heading for pick and roll. With 85 videos of the player running a pick and roll, Miller can sort them multiple ways. What happens when she’s trapped or another defender hedges over? If a screen is set on the defender’s right side, what direction does she dribble? Miller might see, for instance, that 81.7% of the time she dribbles off a screen.
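The tendency numbers Miller reads off the dashboard boil down to filtering tagged clips and taking a percentage. The sketch below shows that arithmetic in Python; the clip records and tag names are invented for illustration, since Synergy’s actual data model is proprietary.

```python
# A minimal sketch of a tendency report built from tagged clips.
# The records and tag vocabulary here are hypothetical.

clips = [
    {"play": "pick and roll", "screen_side": "right", "action": "dribble"},
    {"play": "pick and roll", "screen_side": "right", "action": "dribble"},
    {"play": "pick and roll", "screen_side": "right", "action": "shoot"},
    {"play": "pick and roll", "screen_side": "left",  "action": "dribble"},
    {"play": "isolation",     "screen_side": None,    "action": "shoot"},
]

def tendency(clips, play, action):
    """Share of clips of a given play type in which the player takes an action."""
    subset = [c for c in clips if c["play"] == play]
    if not subset:
        return 0.0
    return sum(c["action"] == action for c in subset) / len(subset)

print(f"{tendency(clips, 'pick and roll', 'dribble'):.1%}")  # 3 of 4 tagged clips
```

With 85 real clips instead of four, the same filter-and-divide step yields figures like the 81.7% dribble tendency Miller cites.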
Scouting with AI
When scouting opponents, Banghart’s staff work in pairs about a week before a game. They start with a broad view by watching five to eight games.
For Miller and Assistant Coach Adrian Walters, the videos provide a sense of the opponent’s pace of play and their offensive and defensive schemes. “Basketball’s free flowing. It’s about pace,” Miller says. “If you watch parts of games or sections of what a player does or just offense or just defense, you miss the back and forth.”
They discuss what they saw and re-watch key segments. They might zero in on what they consider crucial as they did recently on an opponent’s deliberate, slow-paced offense. They add those observations and notes on how to handle them to their scouting report.
Then the analytics work begins in earnest. They scan Synergy’s reports for potential weaknesses or places to attack and look for statistical outliers — a team has a lower-than-usual shooting percentage or allows more points than its average — and their causes.
Next, using video editing tools from SportsCode, Miller and Walters create and tag clips of the opponent’s tendencies. SportsCode translates the tags into analysis of each clip they want to show the team. In what Miller calls a “constant, rolling experiment,” the duo uses the SportsCode and Synergy dashboards to compare Carolina’s players, the upcoming opponent’s players and the players and teams that played the opponent.
Two days before tip-off, the duo shares their report with the entire staff. Later that day, the team watches selected videos while coaches talk about key points. The team will then have two days of practice against a squad of male students who mimic the opponent.
AI on the bench?
Bertasius anticipates that AI will have a seat on the bench in the near future. Coaches will draw plays on a tablet as their teams compete, then the tablet’s AI model will offer plays and lineups with the highest probability of success.
“The amount of information coaches crunch through is insane. In the game it can be difficult to react to situations,” he says. “Pre-game planning certainly informs in-game decisions, but coaches have to decide in a matter of seconds how to react or what play to run for a game-winning shot.”
Still, the computer scientist respects the human ability to adapt to new situations.
“Humans can’t crunch through that much data and remember everything, but we’re much better at creativity, improvisation and responding to new situations.” Say a weaker defensive player enters the game. Coaches notice and take advantage of such things. “Target that player for the three minutes he or she is in the game,” Bertasius says. “Maybe you’ll get four or five extra points out of that. AI is not necessarily going to show how to handle that situation well.”
It seems like AI and coaches will complement each other.
“It’s almost like having this super memory with easy access to thousands of games. You look at the information and decide if it’s useful for you or not,” Bertasius says.