NBA PhDs vs Geeks
Ex-Player Believes Analytics Hurting Those With PhDs In Basketball
Mar 26, 2014 12:30 PM EDT
Chris Broussard of ESPN wrote a piece with quotes from former NBA players expressing concern about a divide between "stat guys" and "basketball lifers".
"Basketball guys who participated in the game through years of rigorous training and practice, decades of observation work through film and field participation work feel under-utilized and under-appreciated and are quite insulted because their PhDs in basketball have been downgraded," the former executive told Broussard.
"The [analytics] narrative is hurting basketball PhD thinkers right now," the ex-player said. "However, if numbers never lie, the basketball PhD thinkers have won more championships by far than the uneducated analytics guy."
While there has been a move away from hiring former NBA players as general managers, the recent hires still have strong basketball backgrounds.
Rob Hennigan played college basketball for Division III Emerson College and is the school's all-time leading scorer.
Sam Presti's first job out of college was as an intern for the San Antonio Spurs.
Pete D'Alessandro was a video coordinator for Lou Carnesecca before working as an agent and as Chris Mullin's right-hand man with the Golden State Warriors.
Masai Ujiri played two years in college and six in Europe before getting into scouting and then advanced metrics.
Bob Myers played four seasons at UCLA before becoming an agent.
Ryan McDonough spent his entire professional career working for the Boston Celtics before being hired as GM of the Suns.
"Hybrids -- double-majors in Basketball and Math, not full-on quants -- are the real future of the NBA GM position," writes Ziller.
Via Tom Ziller/SB Nation
bob
MY NOTE: NBA Geeks-R-Us. The statement that "The [analytics] narrative is hurting basketball PhD thinkers right now. However, if numbers never lie, the basketball PhD thinkers have won more championships by far than the uneducated analytics guy" is weakly founded since the analytics guys haven't been around for that long. I mean, Red and Phil and Pop and Riley et al. didn't have access to that stuff. Yeah, I know Red eschewed such stuff, but maybe he was a "Da Vinci-like genius" who could intuit things that others can't, and that's why they need/use metrics. Danny Ainge believed in "brain typing" and used John Niednagel of the Brain Typing Institute to filter players for personality tendencies (I don't know if Danny uses him anymore, not after Niednagel tested Danny for his brain type and told him he had an IQ of about 70). Never heard the expression "quants" before. It's like calling a software engineer a "prog". I think the last statement makes the most sense. Nothing beats experience; however, a solid education in analysis is unquestionably valuable, and that's what the metrics part is. The experience helps to alert you where the numbers are leading you astray.
bobheckler- Posts : 62620
Join date : 2009-10-28
Re: NBA PhDs vs Geeks
"Quants." The only place I've heard that term applied is to advanced statistical and mathematically driven stock analysis. I think it's short for "quantitative analysis."
The jury's still out on whether such methods provide statistically significant enhanced performance. In other words when quants analyze other quants the results are inconclusive, at least in the world of stock picking.
I can hear Auerbach sneering at all these new numbers-driven metrics. Red often thundered, "The game is simple," followed by words to the effect that you get talented players who are good people and let nature take its course.
Jerry West in his 1970 book "Mr Clutch" said that he didn't think Auerbach was as technically sharp as some coaches but that he also thought that wasn't very important. All Mr Clutch knew was that Red's teams were always superbly conditioned, superbly motivated, unselfish, ready to play and, of course, always won.
I wouldn't be surprised if the new metrics can be shown to provide a team with a razor-thin advantage that could make a difference, but I'd take getting talented players who are good people over a primarily quant approach any day.
As far as having players tested by a shrink, I think that one of the first teams to do that was the Portland Trail Blazers in the early '70s. David Halberstam in "The Breaks of the Game" writes about how a shrink issued a report to the Portland coaching staff saying he did not think hot rookie Sidney Wicks would ever become a great player, finding that Wicks was talented and very intelligent but a mass of contradictions at war with himself.
The coaches discounted this, and really dismissed it after Wicks had a great rookie year, winning rookie of the year, playing in the all-star game and becoming one of the handful of first year players to score 2000 or more points.
Of course, Wicks's play declined every year and reached its nadir in Boston.
Sloopjohnb- Posts : 638
Join date : 2013-12-29
Re: NBA PhDs vs Geeks
There is a craze to overanalyze things in this day and age. Keeping track of statistics and trends is fine, but it's overblown and not quite as important as people believe.
People, not metrics or analyses, are the most important factor in the equation (couldn't help the arithmetic pun).
KJ
k_j_88- Posts : 4748
Join date : 2013-01-06
Age : 35
Re: NBA PhDs vs Geeks
Agreed 100%, KJ. In statistics, there are two key areas: validity and reliability. An oversimplification is that reliability pertains to how reliably data may be projected to a given "population" and that validity (of which there are several types) pertains to whether you're measuring what you think you're measuring in the first place. Reliability may be quantitatively measured (as in the +/- margin of error you see quoted in polls), while validity is usually evaluated through subjectivity or logic to ensure that statistics are being used correctly.
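As an aside, the +/- margin of error quoted in polls, mentioned above as the quantitative face of reliability, comes from a textbook formula for a simple random sample. Here is a minimal sketch; `margin_of_error` is an invented name for illustration, not from any poll or library:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Margin of error for a sample proportion p from n respondents.

    z = 1.96 corresponds to the usual 95% confidence level.
    """
    return z * math.sqrt(p * (1 - p) / n)

# A poll of 1,000 people where 50% favor an option:
moe = margin_of_error(0.5, 1000)
print(f"+/- {moe * 100:.1f} points")  # roughly +/- 3.1 points
```

Note that this only quantifies sampling error (reliability); it says nothing about whether the question measured what it was supposed to measure (validity), which is exactly the distinction being drawn here.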
In short, it takes attention to both validity and reliability to make for the proper use of statistics. It's no different for basketball stats. When I criticize the ersatz stats of Hollinger, it's seldom due to concerns about reliability. It's usually related to my own subjective concerns about validity. There's obviously a place for both quantitative measures and subjective inputs on basketball matters. I happen to place more emphasis on validity because you can have validity without quantitative support, but quantitative measures don't mean anything without passing the test of validity.
That's why I believe Red et al. could have a fantastic track record by using primarily subjective instincts honed by their knowledgeability. Those who depend primarily on quantitative analytics need to reassure themselves that those statistics have passed the tests of validity.
Now, if anyone understood what I just said, perhaps you'd be good enough to explain it to me. LOL.
Sam