Mining Your Own Business Podcast

Season 2 | Episode 6 - Unveiling the Secrets of Successful Analytics Projects with Dr. John Elder

“It helps to have teams that are honest in their feedback … You see it as a wonderful thing if they find a flaw and save you time and help you solve the problem.” (Dr. John Elder)

Join us as we chat with Dr. John Elder, founder of Elder Research, on this episode of Mining Your Own Business. In his decades of experience, John has witnessed the ebb and flow of countless analytics projects. He’s a problem solver who thrives on finding solutions to complex challenges, and along the way he’s discovered keys to achieving success.

Tune in as host Evan Wimpey chats with John about the insight he’s gained along the way. We hope you take away some helpful tips and perspective for your organization!

In this episode you will learn:

  • Why successful data analytics projects rely on teamwork, honesty, and perseverance
  • How Elder Research has approached challenges and helped organizations move from idea to implementation
  • The importance of choosing the right problems to work on as an organization
  • The impact of involving leaders in data-driven initiatives to achieve success and sustainability

Learn more about why we created the Mining Your Own Business podcast.

About This Episode's Participants

John Elder | Guest

Dr. John Elder founded Elder Research, America’s largest and most experienced data science consultancy, in 1995. With offices in Charlottesville, VA; Baltimore, MD; Raleigh, NC; Washington, DC; and London, they’ve solved hundreds of challenges for commercial and government clients by extracting actionable knowledge from all types of data. Dr. Elder co-authored three books (on practical data mining, ensembles, and text mining), two of which won “book of the year” awards. John has created data mining tools, was a discoverer of ensemble methods, chairs international conferences, and is a popular workshop and keynote speaker.

Dr. Elder earned Engineering degrees from Rice and UVA, where he’s an Adjunct Professor. He was named by President Bush to serve 5 years on a panel to guide technology for national security. Lastly, John is grateful to be a follower of Christ and the father of five.

Follow John on LinkedIn

Evan Wimpey | Host

Evan Wimpey is the Director of Analytics Strategy at Elder Research where he works with organizations to transform deficient data into tangible business value that advances their mission.

He is uniquely suited for this challenge by pairing his professional experience in management and economics at high-functioning organizations like the Marine Corps and Goldman Sachs with his technical prowess in data science. His analytics skillset was strengthened while earning his MS in Analytics from the Institute for Advanced Analytics at NC State University.

Evan almost always has a smile on his face, which is at its widest when he is helping organizations use data in innovative ways to solve complex problems. He is also, in a strictly technical sense, a “professional” comedian.

Follow Evan on LinkedIn

Key Moments from This Episode

00:00 Introduction
00:54 Discussing the factors that separate successful and unsuccessful analytics projects
02:31 Sharing stories of how Elder Research has built project success over time
05:09 The importance of leadership involvement to achieve success in projects
07:14 Talking about John’s favorite projects
13:30 Discussing the need to follow up after a project is successfully completed
15:19 The growth and impact of generative AI
18:36 Talking about the importance of reliability of analytic tools
19:54 Discussing John’s transition into a new role at Elder Research
21:02 The vital steps to problem-solving
22:33 Sharing how John’s early work in target shuffling was included in Dr. Eric Siegel’s “Predictive Analytics” book
25:05 Discussing John’s current interests, including advancement of his global search algorithm and investment modeling
29:43 Wrapping up the show

Show Transcript

Evan Wimpey: Hello and welcome to the Mining Your Own Business podcast. I’m your host, Evan Wimpey. And today I’m excited to introduce our guest. And I say this with every show, and I mean it with every show, but sometimes I can mean it a little more than others and be a little more excited. When you heard the intro to the show, you heard that it’s brought to you by Elder Research.

I’m introducing our guest, Dr. John Elder. And that name is not a coincidence. He is the founder of what some would call the greatest consultancy of all time, Elder Research. And he has been in the space going on 30 years. He has seen and done and contributed a lot in this space. We are very excited, John, to have you on the show today. Thanks so much for coming.

John Elder: Thanks, Evan. You didn’t have to, really.

Evan Wimpey: Yeah, there we go. I’ll take the check later. Thanks, John. John, you have been in this space for a while. You’ve done a lot of really interesting and really cool analytics. You’ve seen a lot of analytics projects with a lot of different clients, government and commercial clients.

You’ve seen a lot of them be very successful. You’ve seen, hopefully not as many, I think not as many, but you’ve seen some that weren’t as successful, for a lot of different reasons. Can you talk maybe just generally in themes that you’ve seen, what sort of separates the successful from the not successful projects?

John Elder: Yeah, great question. We kind of did some data mining on our data mining. And let me list sort of the different little clusters of failures, if you will. But first, let me point out that most of our ideas fail: most of the experiments we try, most of the hypotheses we have. But almost all the projects succeed.

So as a data scientist, you have to have a good hearty attitude, you know, confidence that it’s going to work eventually. We’re going to keep poking this thing and trying different things until we find something. And it really helps to have teams that are honest in their feedback.

You know, where your colleagues are trying to break what you’re doing for the good of everyone, and you don’t take it personally. You see it as a wonderful thing if they find a flaw and save you time and help you solve the problem. So I think we’ve successfully built a culture of that, where we’re sincerely interested in solving the client’s problem and making them happy and successful, and that means kind of checking your ego at the door and working cooperatively together.

So sometimes we solve problems that, we find out later, others had failed at. On an oil and gas exploration problem, for instance, we found out later that four other teams had failed on the problem in earlier years.

So we feel like we really are the best at solving really hard problems. If we look at our failures, which are few: in our first decade we were about 90 percent successful in technically solving the problem, but then only 60 percent of the projects actually got implemented.

So there’s kind of a couple levels of success: the technical success, meeting all the objectives, and then whether they actually use it or not. Now, we’re coming up on our third decade, so we need to update the stats, but in the second decade that moved to 99 percent technical success and about 90 percent implementation success.

And I’ve heard that the industry-wide implementation rate is more like 20 percent or less. So we feel like our long experience has helped us do things from the very beginning that help implementation happen. But back to the failures. Don’t do anything in the markets. Don’t try to predict the stock market.

We did have one very successful, very long-running success with a hedge fund, but most everything else has been very hard. And if you think about it, there’s thousands of people working very hard every day, and markets are kind of self-correcting. And likewise, don’t do horse racing, don’t do other gambling: even though there’s an edge of information we could find, it wasn’t great enough to beat the built-in extra costs that the establishments incur.

And then, don’t do anything for startups. They want to pay you in pieces of paper that are not recognized at the grocery store. And they also sometimes pivot pretty radically, from, you know, making robots to serving Fritos. It’s pretty crazy sometimes. And it does help if the client has a budget. So, you know, that’s kind of a really important thing. Early on, a lot of our projects came from technical contacts. It would be another technical person saying, oh my gosh, you can help me with this technical problem.

And then they would have to go solve the problem of finding the budget and calling it whatever and so forth. And we’ve gotten a lot more success in recent years due to some of my colleagues who are really good at making contacts at more of the C-suite level, the executive level, and getting buy-in for kind of a makeover of the company into being, you know, data-driven and things like that.

And that is a more long-term, more sustainable engagement when you have leaders bought into it.

Evan Wimpey: Yeah, that’s great. Do you think engaging at the leadership level helps drive the implementation success? You know, regardless of technical success, why is it only 90 percent?

John Elder: Because some of our early projects were forced on a group.

A group would be solving a problem, but not doing well enough according to their leadership. And then leadership would bring us in as an outsider. You can imagine how well we were initially received by that group. It takes a while to earn their trust, for them to see that we are there to help them, not to make them look bad and so forth.

So that’s one issue to think about anytime something’s leader-driven rather than unit-driven. But the advantage of it being leader-driven is they think in terms of budgets and goals, and on a little bit longer term than, you know, a project-related group. So if you can win their trust and show them results and be proven reliable, they love going with people they know and trust and have worked with before.

So once we do the hard work of eventually proving to a client that we can help them, we get a lot of repeat business and have very positive relationships with them. That’s good. So, fewer clients, but that shared trust allows you to be much more efficient in getting things done.

Evan Wimpey: Certainly. Yeah, that’s great. Coming up on a third decade, that’s a lot of projects. Do you have a favorite project that you’ve worked on?

John Elder: I was thinking about that because we’re moving our offices here in Charlottesville after 15 years in a nice old building. We’re moving to the other side of the mall.

Probably for better parking, I have no idea. I didn’t have anything to do with the decision; that’s what happens when you delegate everything. But people were kind of giving me a hard time about all these boxes of notes and project files I had, and that we’re going to have to move them again. So I’ve been going through them.

And so I actually have been revisiting a lot of old projects, and that’s been kind of fun as I cull to save the highlights. So there is one that comes to mind, and it … it has a lot of things that make it very typical for us, some in a positive way and some in a negative way. It was a very successful project.

It was generated by someone who attended one of my courses. I do periodic courses on data science, and they sometimes generate leads or contacts. She was the head of a group at a pharmaceutical company; Pharmacia and Upjohn had recently merged. And high-volume compound testing had just come out.

And they had a favorite one from there that they had taken out to A/B tests, you know, double-blind trials, both in Europe and the U.S. And they had a bunch of data from that, and they were analyzing it, and it just wasn’t good enough. Their best compound probably wasn’t good enough to go to market.

And to find out if it was good enough or not, they were going to have to spend a billion dollars. They had spent 300 million getting to this point. And this was a couple decades ago, when a billion dollars was real money. And they said, we can’t do it alone.

We’re going to meet with a larger company in a month. Can you look at the data and see if there’s something we missed? You know, in a month. So, yeah, it was kind of like hold the presses, you know, we’re going to work on this. We’re going to do the minimum we need to on all the other projects we’re working on and kind of have a whole-of-company attack on this.

We weren’t a very large group at the time, and we didn’t know anything about pharmacology. And they said, well, we can teach you what you need to know quicker than you can teach us data science. And I thought, what? She just took a course from me. Maybe that’s a bit of a dig about my teaching.

But anyway, I’ll take it. So long story short, we were very successful. We showed them that by looking at it fresh, changing some things that they hadn’t thought of, and combining information from different sources, they had a very clear and powerful and useful drug that was better than placebo. And placebo is an amazing opponent.

If you put the placebo up against nothing, the placebo is good for everything. It is statistically significantly better than nothing. And so to be better than a placebo is actually hard to do. And the pharmaceutical scientists estimated that for their existing blockbuster drugs, 70 percent of the effect was a placebo effect.

And only 30 percent was the drug. So, you know, it’s a little scary when you start dealing with medical information, because of the things you learn. But anyway, beating the placebo is very hard. The test they were doing was based on the way the Food and Drug Administration looks at things.

So they were studying for the test. But the question they were asking was not an FDA question; it was an internal investment question: do we do this? And one of the keys was that it was a psychological problem they were trying to address, and that’s not as quantifiable as a lot of other medical problems.

And so, just a small thing that hadn’t occurred to them, and here we are looking at it from the outside, with experience in a lot of other different areas: we used information from survey analysis to know that people have different set points. So, think of a pain scale from one to ten.

You know, you and somebody else might have a very different idea of what a bee sting is versus smashing your fingers in a door, in terms of what those numbers are. Well, they were just analyzing the before and after numbers from the administration of the placebo or the drug as raw numbers, rather than scaling them per person and looking at the delta.
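Here is a minimal sketch of that per-person rescaling idea in Python; the column names and toy data are hypothetical stand-ins, not the actual trial analysis:

```python
import pandas as pd

# Hypothetical trial data: each subject rates pain 1-10 before and after treatment.
df = pd.DataFrame({
    "subject": ["a", "a", "b", "b"],
    "arm":     ["drug", "drug", "placebo", "placebo"],
    "phase":   ["before", "after", "before", "after"],
    "pain":    [8, 4, 3, 2],
})

wide = df.pivot_table(index=["subject", "arm"],
                      columns="phase", values="pain").reset_index()

# Raw change ignores that subjects use the 1-10 scale differently.
wide["raw_delta"] = wide["after"] - wide["before"]

# Scaling per person (here, relative to each subject's own baseline)
# puts everyone on a comparable footing before comparing arms.
wide["scaled_delta"] = (wide["after"] - wide["before"]) / wide["before"]

print(wide.groupby("arm")["scaled_delta"].mean())
```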

And when you did that, it was very helpful. Also, they had three different tests, sort of assessments accepted in the literature, by which they might prove the thing was working or not. And they were already cheating a little bit by trying to, you know, cherry-pick which test they would use to go forward to the agency.

But really, although the tests were very correlated in their results, they actually had different information in them, enough that it was best to use all three tests. And so we did a 3D visualization where each axis was a scaled test: negative 1 is really bad, positive 1 is good, and everybody starts out at the origin in the middle.

The points all explode outwards as the person’s being treated, either in a positive or negative way, with some variation based on the test. And then we used kernel density estimation to summarize what the distributions were doing. So we have these sort of bubbles within bubbles. You start out here; the placebo is like a little bit positive, a little bit negative.

The drug compound, boom, it’s kind of all moving into the positive spot. And so with that one graph and the explanation that went with it, they made a billion-dollar decision. For years, we didn’t know what that decision was. But ten years later, someone at a conference grabbed me and said, I’ve got to tell you, it worked.

It was one of the three drugs that we introduced in that decade. And it’s a big success. Thank you so much. So, you know, it was one of these things where we contributed to an area we knew nothing about. We knew nothing about the specifics, but we know a lot about data. We helped them look at it fresh, just like they wanted, and we helped them make a decision that they needed.

It was a big success, but we never did any other work in that field, and for years we never heard about it. So from a business standpoint, it was a really bad move, you know, a one-shot miracle and then nothing else came of it. And that was typical of us as well. We would do projects in all sorts of different areas, very proud of it, and we would learn lots of things in those areas that then sometimes would help us.

But from a business standpoint, you really want to follow up on your successes and find other similar kinds of problems you can solve. And luckily, folks who came and joined us later were much better at that; otherwise, you know, the business wouldn’t have been able to grow.

Evan Wimpey: That is a really, really exciting story. And I know there was no immediate closure, but at least we’re 10 years past it now. So you at least know something good came from it.

John Elder: Yeah, it was a lot of fun. And, you know, I just finished the first season of the Lincoln Lawyer show last night, and on one of the episodes he was talking about how he had represented a lock picker.

So, somebody who’s breaking into a safe, and the sound when that final little thing clicks into place: that’s what that person lived for. And I thought, I know what that’s like. You know, when you finally solve that problem that’s been bedeviling really smart people for a long time, and you’re able to contribute enough to solve it and get that thing to click into place, it’s extremely satisfying. It can keep you going for a long time.

Evan Wimpey: Yeah, that is very exciting. Sometimes you’ve got to wait 10 years to know if the safe actually opened or not.

John Elder: To know that they opened the safe. You unlocked it. What was in it?

Evan Wimpey: Yeah, yeah. And unfortunately, they get to keep whatever’s in the safe. They keep it all.

John Elder: Yeah.

Evan Wimpey: So that is a really interesting use case, I think, about drug discovery and pharmaceuticals. You know, in the last half a year or year, large language models have been, you know, on the front page of not just the AI world, but really the commercial and business world.

Is there anything that you’ve seen come about, not necessarily in pharmaceuticals, but just in analytics, where you thought, this is really cool, this is something that surpassed my expectations?

John Elder: Yeah, that’s a great example, that and just the generative AI type stuff. And when AI was able to win at Go.

That was a surprise. And someone who knew the research pretty well and the game theory pretty well summed up my indistinct feelings about it when they said, wow, we’re about four to five years ahead of schedule here. And that’s kind of the sense I had too. It was like, whoa, that was a little sooner than I expected, you know.

And I think the speed of improvement and the pace in that arena is a bit jarring and exciting and scary at the same time. But the thing that we’ve always focused on is reliability. So I look at the current ones, and I know they’re getting better, but I told somebody they’re like a friend that’s trying to impress you by throwing out a bunch of facts, you know, half of which are false.

And it’s like, I don’t really know how you could use that. You know, maybe if you’re doing ad copy. But we put a huge premium on the reliability of answers, of information. So, I know they are working on that, but it is amazing that from a very small seed they can generate a plausible large flower.

So it’ll be fun to watch how that develops.

Evan Wimpey: Yeah, that’s good. Maybe the analogy there, of throwing out a lot of facts to try to impress, maybe it’s like a dating profile. Maybe it can help you write your dating profile. Is there anything …

John Elder: And about a third of things are not true.

Evan Wimpey: Yeah, exactly. Nobel Peace Prize winner, Congressional Medal of Honor, got it all. Is there anything that’s just maybe on the other side of that? Like, you know, I thought we’d have self-driving flying cars by now. Why hasn’t AI been able to do this? I thought we’d cracked that by now.

John Elder: It’s interesting you bring that up. I don’t have that immediate impression, but on the car thing, one of the things that surprised me is that they allowed self-driving cars, you know, legally. America is such a litigious society, one a lot of Europeans joke about. I was in Ireland, where they have the Cliffs of Moher, and they basically have a sign that says, you know, if you go too far and fall off,

you’re stupid, don’t do it. They don’t have a fence. But that’s not the way the U.S. does things. You know, you think you’re supposed to be in a bubble of safety at all times. So to allow these things out on the road kind of astonished me. It was nice from a technical standpoint, but I hope it doesn’t prove too dangerous.

I think in the long run, it’s going to improve safety to have the vehicles running and so forth. Anyway, certainly that was a positive. I thought, actually, that they were allowing it to be a little bit experimental.

Evan Wimpey: Yeah, that’s great. And I really do like that, John. You mentioned the reliability of model output, of analytic tool output.

And we think cutting edge, we look at the new things that have been developed, and reliability feels like one of the stalwarts: it was important thirty years ago when you were building your first models, it continues to be important today, and presumably it will be as long as analytics is a field.

John Elder: Yeah, I think that knowing the reliability, you know, the degree of accuracy, and the cost of a mistake is so important anytime you address a new problem. Mistakes are almost never equal in cost: a false positive versus a false negative, and so forth. And so being able to estimate those costs and use them to help make the best decision is still really valuable.

It’s astonishing how often people use percent correct, which just assumes equal weights on all kinds of errors, you know, instead of thinking through the problem.
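To make the unequal-cost point concrete, here is a minimal sketch in Python comparing percent correct with a cost-weighted view; the labels, predictions, and cost numbers are invented for illustration:

```python
import numpy as np

# Hypothetical labels and predictions (1 = event of interest, 0 = not).
y_true = np.array([0, 0, 0, 0, 0, 0, 0, 0, 1, 1])
y_pred = np.array([0, 0, 0, 0, 0, 0, 0, 1, 0, 1])

# Percent correct weighs every error the same.
accuracy = (y_true == y_pred).mean()

# Illustrative, made-up costs: missing a real event (false negative)
# hurts far more than a false alarm (false positive).
COST_FP, COST_FN = 1, 50

fp = ((y_pred == 1) & (y_true == 0)).sum()
fn = ((y_pred == 0) & (y_true == 1)).sum()
total_cost = COST_FP * fp + COST_FN * fn

print(f"accuracy: {accuracy:.0%}")        # 80% looks fine...
print(f"total error cost: {total_cost}")  # ...but the single miss dominates
```

Two models with identical accuracy can carry very different costs; thinking through the problem means picking the cost numbers first, then choosing the model and threshold that minimize them.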

Evan Wimpey: Yeah, my own personal neural network can get a pretty high percent correct on a lot of problems just by guessing the baseline. A lot of the folks, John, that listen to this show are out there leading analytics teams right now.

And I think about your position as a leader in this company here at Elder Research, and then I think about your history as a very strong and creative technical contributor. And I’m curious how you think about your role now, and how you’ve seen that transition. Is it important to stay on top of things technically?

Do you have to let go of some things?

John Elder: Yeah. I tell people, if you have a business, it’s just a process of letting go. You do it all at first, and then you lead it all, and then maybe you bring in the work, and then maybe you review the work. I kind of wave at it as it goes by now. You know, I’m not really doing much technically, and so I’m missing out on some of the fun.

But what I realized, and it’s funny going through these old notes, is that I was so technical in the beginning; I was always trying to find the best way to handle all these different technical issues. That actually plays a very small role in solving someone’s problem. I mean, obviously you have to have people with the technical smarts, and the more they know, the better the techniques they can bring to bear.

But the real problem is figuring out what the real problem is. You know, figuring out what the issues are; you get so excited when you can finally map that to a technical technique. But not everybody has to master everything. So, you know, choose a few things to stay current on. And about other things, I have to say, I don’t know a lot.

And that’s a little bit humbling. But you have to focus, I think, on what’s most important to make an impact. And it’s not going to have an impact if it isn’t used. So you have to learn how to be trusted, how to have the technology be trusted, how to listen, how to solve the real problem, how to get early enough results that the project doesn’t lose momentum or support.

And you know, some people call them soft skills; these are very interpersonal skills. And again, thinking as if you’re the other person, and what could help them the most. A lot of times for us it’s technology transfer. We’ll solve a problem, and then we’ll teach the client how we did it, and they really appreciate that, and they often bring us in the next time knowing that we’ll be generous with our information.

Evan Wimpey: Yeah, I think that’s great, and I’ve got the visual now of the work flying past you and you waving to it as it goes by. Is there … I guess it’s sort of a two-part question: is there ever a time when you feel an urge, like, that project looks meaty, I want to jump in on that?

John Elder: We had a situation where we had a lead.

Somebody had read Eric Siegel’s book, and I’m featured in chapter one of Eric’s book because our big project with a hedge fund was a big success. And early on, even before the client invested, I got permission to invest in it, and my wife invested all of her savings in it. So this really impressed Eric, and he used it as an example. And somebody had read that book and then asked Eric to maybe help him, and he said, oh, well, I’m busy, but John Elder would be great.

And the guy was like, he’s real? Like, that’s a real person? So they got real excited about talking with me, and he had this small project, but it was right up my alley because it was looking at the significance of different combinations of things. And I’m really focused on the significance of results.

I mean, if you think back to that drug discovery: hey, we’ve got this little bit of difference, how significant is it? Well, first of all, we showed them the difference was bigger, and then that the significance was really high. All the statistics, you know, kind of boil down to how likely that could have happened by chance, and that’s your statistical significance level.

And the way this is typically done is not good. So we’ve got a new way of doing it, target shuffling, which I try to evangelize about all the time. But that really is the key question. And so here was a project that was small, too small for our commercial group to take on. So I said, I’d like to …

So I got a little programming help, and we did a really … It was one of those kinds of things where the thing I was really interested in was exactly the technology that was needed for this problem. And so it was very satisfying. And a success.

Evan Wimpey: Ah, that’s very cool. That’s very exciting. And for folks who just got the tease of target shuffling, we’ll put a link in the show notes. Wherever you’re listening to this, you can click there and learn more about target shuffling.

We’ll have a link to John teaching you about target shuffling there. Yeah, John, I want to ask you one more question. Maybe this is sort of a lead-in from the ideal project that floated by, that you got to jump on, in sort of the magic world where you’ve got a client who says, oh, John Elder, he’s real.

I just want to pay John to do whatever work he finds. You’ll do something fun.

John Elder: Do something fun. You know, it’s funny, because I’m kind of in that situation. They pay me here, and I’m not really sure if I’m worth it. You know, before the pandemic I was jetting all over the world and teaching, busier than ever.

And then when that stopped, I was working from home for about a month, and my wife said, don’t you have some place you have to go? Okay, I got it, I got it, I’ll leave. So anyway, I’m sort of semi-retired now, so I do have a little bit of fun picking and choosing. I guess there are three things that I’m interested in working on right now.

One is, I created a global search algorithm for my PhD dissertation years and years ago, and I want to get that code out there. We used it for several projects early on at Elder Research, but I want it to be available to other people. So we’d love to get that code out there. And that’s a small, doable thing.

Another thing is I love investment modeling, and I’ve had this great idea that I’ve done on my own for years, and it’s not even my original idea. The short version of it is: instead of investing in an index, which is a great thing to do, you buy the stocks individually, and at the end of the year you give away the ones that have gone up, taking advantage of the tax laws that allow you to not pay any tax if you give away a long-term appreciated asset.

So if you’ve had it for over a year and it’s gone up, say from 100 to 150, you give it away: you get a tax deduction of 150, the charity gets 150, and nobody pays the tax on the 50 gain. So it’s a nice incentive to donate to 501(c)(3) charities. On the flip side, if you lose money and you sell it and realize that loss, up to 3,000 of losses can be counted against ordinary income.

And the other losses get counted against other gains, and you can roll over those losses from year to year. So these losses are eventually going to help you. So you buy all these stocks, some go up, some go down; you sell the losers, you give away the winners, and you get all these tax benefits that make the effectiveness much better than just an index.
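As a rough sketch of the arithmetic behind the idea, here is a toy Python version; all prices, position sizes, and tax rates below are invented for illustration, and none of this is tax advice:

```python
# Toy illustration of the give-away-winners / harvest-losers idea.
# All numbers are invented; real tax treatment is more complicated.

MARGINAL_RATE = 0.35    # assumed ordinary income tax rate
CAP_GAINS_RATE = 0.15   # assumed long-term capital gains rate

# Hypothetical positions held over a year: (cost basis, current value)
positions = [(100, 150), (100, 80), (100, 130), (100, 60)]

deduction = 0.0  # charitable deduction from donated winners
losses = 0.0     # realized losses from sold losers

for basis, value in positions:
    if value > basis:
        # Donate the appreciated stock: deduct its full value, and the
        # embedded gain (value - basis) is never taxed by anyone.
        deduction += value
    else:
        # Sell the loser and realize the loss.
        losses += basis - value

tax_saved = (deduction + min(losses, 3000)) * MARGINAL_RATE
gains_tax_avoided = sum(max(v - b, 0) for b, v in positions) * CAP_GAINS_RATE

print(f"deduction: {deduction}, losses harvested: {losses}")
print(f"approx. tax saved: {tax_saved:.0f}, "
      f"plus {gains_tax_avoided:.0f} in gains tax avoided")
```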

So, that’s a cool thing. I’d love to kind of make a business out of that, or keep it small, and it would help a lot of people. And then the thing that I love to talk about when I go to conferences and all is these ways of calculating significance, ways of telling how likely it is that chance could have produced a result as good as yours.

And you have to think about chance using your data science. Like, given this data science algorithm with all of its powerful search and combinatorial things that it’s trying, if the data is random, some combination is going to come out ahead of all the others. And it’s going to look pretty good, even though there’s nothing real there.

Basically, what you’re trying to do is create the placebo effect. So, you take the system with the original data, and you do the best you can to get a score. Then you say, let me scramble the data, in this case by shuffling the target variable: let me change it so there’s no relationship between the inputs and outputs, and I know there’s no relationship.

So anything I find on this second run is going to be spurious. And by shuffling it several different times, you get a distribution of the best finding that chance can come up with. And you compare your best finding to that. So you’ve got your best finding from the original data, and you’ve got the best findings from all these manipulated versions of the data, and where you sit in that distribution tells you the significance of the result.

And this can calibrate your finding to a probability, and can completely eliminate the problem of multiple comparisons, where somebody says, look, I rolled double sixes. Whoa. Was that one roll or a hundred rolls? They cherry-picked their best result. So, anyway, that simple idea is not well understood, but it would probably solve at least half of the inadvertent fraud that’s being done today in science.

Because in science, somewhere between 65 and 85 percent of all papers published in peer-reviewed, good journals are false; in other words, they can’t be replicated. And this would cut that way down. Now, one of the huge problems with it is that it would also cut down on publishing. So everybody in academia that depends on publishing is not really happy with the idea.

But if you’re actually depending on finding a real result, it’s something we should try to find a way to do.
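For readers who want the mechanics, here is a minimal sketch of target shuffling in Python; the data, the single-variable models, and the R² score are stand-in choices for illustration, not a prescribed recipe (see the target shuffling link in the show notes for the full treatment):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Stand-in data: 200 rows, 10 candidate predictors, one target.
X = rng.normal(size=(200, 10))
y = rng.normal(size=200)

def best_score(X, y):
    """Score of the best single-variable model found by searching all columns."""
    return max(
        LinearRegression().fit(X[:, [j]], y).score(X[:, [j]], y)
        for j in range(X.shape[1])
    )

actual = best_score(X, y)

# Shuffle the target many times. Each shuffle breaks any real input-output
# relationship, so the best score found is purely what chance (plus our
# search) can produce: the "placebo" distribution.
shuffled = [best_score(X, rng.permutation(y)) for _ in range(500)]

# Empirical p-value: how often chance matched or beat the actual result.
p_value = np.mean([s >= actual for s in shuffled])
print(f"best actual R^2: {actual:.3f}, target-shuffling p-value: {p_value:.3f}")
```

Because the full search is repeated on every shuffled copy, the chance distribution automatically accounts for however many combinations the algorithm tried, which is what dissolves the multiple-comparisons problem.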

Evan Wimpey: Yeah, John, that’s great. When you champion a new publication and say, I’m going to publish only real results, then I’m in, I’m subscribing. Those are all very exciting. John, I absolutely hope that you find time and get an opportunity to work on those. I’d love to follow them. If folks want to follow you, they can always check elderresearch.com. I don’t know if there’s anything else they can do to follow you or your work in particular.

John Elder: No, I don’t tweet or anything, or X or whatever it’s called now, except once in a blue moon.

Evan Wimpey: Awesome. We’ll put the links in the show notes. Our guest today has been Dr. John Elder. John, thanks so much for coming on the show.

John Elder: Thanks, Evan.