Divorce at Altitude: A Podcast on Colorado Family Law

Litigation Tactics for CFIs & PREs (Presentation at the 2021 Colorado Family Law Institute) | Episode 50

September 02, 2021 Ryan Kalamaya & Amy Goscha Season 1 Episode 50

An integral piece of some family law cases is a report from a child custody expert or parenting evaluator.

Ryan Kalamaya and lawyer/forensic psychologist Dr. John Zervopoulos discuss litigation tips for Child and Family Investigators (CFIs) and Parental Responsibilities Evaluators (PREs) at the 2021 Colorado Family Law Institute.

About Dr. John Zervopoulos

Dr. John Zervopoulos is a lawyer and board-certified forensic psychologist who runs a forensic consulting service, PsychologyLaw Partners, to help family lawyers understand psychological issues, conduct better witness and expert examinations, develop more compelling case arguments, and identify themes in their cases.

He has conducted evaluations in more than 400 court-appointed forensic cases and has been an expert witness in more than 100 cases. Dr. Zervopoulos has also authored Confronting Mental Health Evidence: A Practical PLAN to Examine Reliability and Experts in Family Law and How to Examine Mental Health Experts: A Family Lawyer’s Handbook of Issues and Strategies, both published by the American Bar Association. 

What is Divorce at Altitude? 

Ryan Kalamaya and Amy Goscha provide tips and recommendations on issues related to divorce, separation, and co-parenting in Colorado. Ryan and Amy are the founding partners of an innovative and ambitious law firm, Kalamaya | Goscha, that pushes the boundaries to discover new frontiers in family law, personal injuries, and criminal defense in Colorado. 

To subscribe to Divorce at Altitude, click here and select your favorite podcast player. To subscribe to Kalamaya | Goscha's YouTube channel where many of the episodes will be posted as videos, click here. If you have additional questions or would like to speak to one of our attorneys, give us a call at 970-429-5784 or email us at [email protected].

************************************************************************

DISCLAIMER: THE COMMENTARY AND OPINIONS ON THIS PODCAST ARE FOR ENTERTAINMENT AND INFORMATIONAL PURPOSES AND NOT FOR THE PURPOSE OF PROVIDING LEGAL ADVICE. CONTACT AN ATTORNEY IN YOUR STATE OR AREA TO OBTAIN LEGAL ADVICE ON ANY OF THESE ISSUES.

Ryan Kalamaya (3s):
Hey everyone. I'm Ryan Kalamaya. And I'm Amy Goscha. Welcome to Divorce at Altitude, a podcast on Colorado family law. Divorce is not easy. It really sucks. Trust me, I know — besides being an experienced divorce attorney, I'm also a divorce client. Whether you are someone considering divorce or a fellow family law attorney, listen in for weekly tips and insight into topics related to divorce, co-parenting, and separation in Colorado. Hi, I'm Ryan. I'm a family law lawyer up in the mountains in Aspen, Colorado. Today I'm going to be presenting with John, who will introduce himself next.

Ryan Kalamaya (49s):
And we're going to be talking about litigation tactics for parenting evaluators, specifically CFIs and PREs. John Zervopoulos?

John Zervopoulos (57s):
I'm John Zervopoulos. I'm so pleased to be presenting here with Ryan. I am a board-certified forensic psychologist and also a lawyer. I run PsychologyLaw Partners, in which I assist lawyers in understanding mental health issues in their cases. And I help them effectively manage the work and testimony of experts — helping them write deposition questions, examination questions, motion language, and the like — in the role of a consulting expert to lawyers. So we've

Ryan Kalamaya (1m 33s):
Got a presentation here on that that we're going to go through. As you can see, we're listed here — you can find some more information about us, as well as the podcast that my partner, Amy Goscha, and I have on family law here in the mountains. John, can you see everything okay?

John Zervopoulos (1m 51s):
Yes. Okay. So

Ryan Kalamaya (1m 53s):
The first thing is we're going to set the stage. As most people attending this would understand, we're talking about parenting evaluations. So for our hypothetical situation: your client's involved in a parenting dispute, the court appoints a child and family investigator or a parental responsibilities evaluator, and you're with your client throughout the whole process. Then the report comes out, and everyone goes straight to the recommendations. They scroll through the 30, 40 pages or whatever the report is, and it's either really good or really bad. John, sounds about right from your perspective?

John Zervopoulos (2m 34s):
Absolutely. Or it looks good for the first 80 pages and the last two pages don't look so good. Right?

Ryan Kalamaya (2m 41s):
So John, we're in Colorado — you're down in Texas. Right now, as we're recording this, we're dealing with a fair amount of smoke, but it's kind of a matter of where exactly the smoke is coming from; it happens that all this smoke from these wildfires is coming from Oregon and other places. But can you explain for the audience what this cartoon slide is about?

John Zervopoulos (3m 3s):
Sure. Well, it is sort of Colorado-centric, but here we have a lawyer cross-examining an expert. Smokey is certainly an expert — he's even got his name on his hat, so that means he's a board-certified expert — and he's opining: absolutely, where there's smoke, there's fire. But then the question becomes: how do you take that apart and know whether what Smokey is talking about is reliable and trustworthy, whether he can back up and support that opinion? Right? So

Ryan Kalamaya (3m 36s):
The procedural options — I'll give a quick overview. If a CFI was appointed and the report is adverse, you can obviously talk with your client about getting a PRE, and the applicable statute is C.R.S. 14-10-127. If a PRE was appointed, you can ask for a supplemental PRE — the statutory language and citation are there on the slide. You can also consider a motion to strike. As John and I will get into a little bit, it's really difficult, if not overly optimistic, to ask the court to strike an evaluator that it has appointed on its own behalf.

Ryan Kalamaya (4m 19s):
A lot of judges see a PRE or a CFI as their expert — I've certainly heard judges refer to a PRE as "my expert" — because of their neutral nature. And so it's a challenge. Oftentimes, when you asked the court to appoint, or agreed to, a particular expert, it's really hard to go after them and say, you know, you should strike them. But you obviously can do that; there are pros and cons. You can certainly get out some of the things that John and I are going to talk about — the criticisms that you may have of an evaluator — in a motion to strike, but it is rare that they are granted. So what else can you do? You can engage in discovery, and you can request the evaluator's file.

Ryan Kalamaya (5m 3s):
John, "look under the hood" — what does that mean?

John Zervopoulos (5m 5s):
Well, you know, basically, when the evaluator writes the report, they're trying to put forth their best argument. They may not put a lot of negative things in the report that would reflect poorly on themselves. Looking under the hood gets you to see what exactly the evaluator is pulling from their file in order to frame their story in the report. So I think it's really important that if you're going to assess the car, so to speak, you really have to look under the hood as well and see how it's running — see what is supporting the report's assertions. And we're going to get

Ryan Kalamaya (5m 45s):
Into specifically where the engine is, where the oil is, in terms of a report, and John's going to explain some of the things that he looks for in particular in a file. But to get back to the slideshow: you can obviously do written discovery to the other party and ask some questions that maybe the evaluator missed, and you can do subpoenas — a subpoena duces tecum — to mental health providers or other people that maybe the evaluator overlooked. And then obviously you can engage in depositions, and you can do a deposition of the actual evaluator. John helps draft those questions, and whether you're setting it up for a motion to strike or you just want to understand a little bit more, that's certainly something worth considering.

Ryan Kalamaya (6m 33s):
Then obviously you've got a work-product expert. You can ask another evaluator to comment on and testify about the shortcomings of the report, and any work-product expert is going to ask for the file. Anything else, John, that you want to mention about these topics? Sure.

John Zervopoulos (6m 54s):
I mean, we talked a little bit before about the notion, in discovery, of making sure that the billing statement of the evaluator is included in the file. I see the billing statement as kind of like a skeleton — or an x-ray, maybe, is a better way of putting it — of the evaluation. The evaluator is usually charging by the hour, and they're not going to give their time for free. So each element of the evaluation will be noted in the billing statement, along with how much they charged and how much time was spent on each entry. So you can get a quick x-ray of the evaluation just by going through the billing statement and seeing what is there, how much time was spent, the dates, and so forth.

John Zervopoulos (7m 39s):
And I just think that is just incredibly important,

Ryan Kalamaya (7m 43s):
Get into the biases. But an example would be that you see an evaluator write a report, essentially finish it, and only then go through the collateral interviews. That might be an indication of a bias, or some oversight, which we'll get into. Yeah,

John Zervopoulos (8m 2s):
Absolutely. I mean, just to quickly note: the evaluator will not put in the report that they waited until after they finished all the interviews, and that only a week before they wrote the report did they contact collaterals. They'll just say in the report that they contacted collaterals. That's what looking under the hood means. And

Ryan Kalamaya (8m 21s):
John, you and I worked on a case where the evaluator put dates for the interviews with the parties but didn't put dates for the collaterals, and it was pretty clear that that was intended to gloss over when exactly that happened. But we've got this cartoon — I mean, you've got a lot of information in the file and the reports, and the point is: how do you wade through all these technical terms and all this information to really get to the heart of the matter? Because oftentimes it can be overwhelming. So John, can you walk us through the conceptual framework and the practical framework — what do those mean?

John Zervopoulos (8m 59s):
Well, we're going to go through this pretty quickly — it's detailed a little bit more in the paper that Ryan and I put together — but there are two frameworks to consider when looking at the skeleton of an evaluation. The first is the conceptual framework; the second is the practical framework. Now, the conceptual framework comes from the American Psychological Association's child custody evaluation guidelines. They're currently under revision, so hopefully this will stay in, but I just see this as a very tidy, three-pronged definition of the best interest of the child and what evaluators should be looking for in their evaluation: the parenting attributes and deficiencies, the child's psychological and developmental needs, and the resulting fit — in terms of a three-part best interest argument.

John Zervopoulos (9m 44s):
As a lawyer, take that and organize your arguments around it. Then go through the various factors that the statutes lay out in terms of best interests, and then come back and summarize. And you can see where the evaluation may fall short in the analysis of either of the first two prongs, and whether the evaluator accurately put the first two together into a set of recommendations to the court about how the court should view the family. From there, I

Ryan Kalamaya (10m 13s):
Mean, John, your practice is nationwide, so you're consulting with attorneys all across the country. Most states, as far as I understand it, use best interest, but in Colorado, under C.R.S. 14-10-124, we're getting into specific factors. They can be put into this general framework, though — it's just helpful to understand the conceptual framework. But moving on to the practical framework: walk us through a parental responsibilities evaluation — essentially a parenting evaluation and report. What goes from an evaluation into that, from a practical standpoint?

John Zervopoulos (10m 49s):
Well, here you see a three-legged footstool. I put it there to picture what the generally accepted methods are, and it's a nice metaphor for gauging how good the evaluation is. The way I look at it is that three well-constructed legs make for a solid footstool, but if one or more of the legs is a bit wobbly, then the footstool starts wobbling and may even fall down. So what are the three generally accepted legs? The first is interviews — obviously the evaluator is interviewing the parents and the children. Questions you might ask: how many interviews were done?

John Zervopoulos (11m 31s):
Was it just one or two, or were there several, where the evaluator was able to get to know the family? What were the interviews like — mostly history-taking, versus really dealing with the court's concerns? And also, the timing of the interviews. The next leg is testing and questionnaires. Now, not everyone who does a PRE is a psychologist, and non-psychologists will probably not do testing, but they will give questionnaires. For testing, the questions are: what kind of testing was done, how were the testing results interpreted, and so forth — we'll get into that a little bit later. I put the questionnaires in the second leg because sometimes what evaluators do is substitute questionnaires for interviews.

John Zervopoulos (12m 18s):
Maybe they'll do one interview and say, well, the person answered all the questions in a questionnaire. But keep in mind that answering questions when you're at the kitchen table with a glass of wine, with someone else there as you fill out the questionnaire, is very different from answering questions in person in the office. And the third leg — John,

Ryan Kalamaya (12m 39s):
Just to go back on the second leg — you and I have kind of worked through a lot of the answers to those questions, and the questionnaires are readily available on the internet, right?

John Zervopoulos (12m 48s):
Exactly. So the third leg is collateral sources. What collateral sources did the evaluator consider? Certainly depositions and pleadings and so forth, but did they talk with teachers — the relevant people they should talk with to get a better sense of the family? And sometimes, as we talked about before, folks will not contact collateral sources until the very end. That's a big problem. Now, John,

Ryan Kalamaya (13m 17s):
You're the author of an ABA-published book on examining mental health professionals, and this is the question that you really lead off with in your book: "How do you know what you say you know?" So can you explain — what does that mean? Well,

John Zervopoulos (13m 35s):
Let's think again about Smokey, right, in our initial cartoon. He answers the question: well, where there's smoke, there's fire. Well, the follow-up question by the lawyer is: Dr. Smokey, how do you know what you just said? What support is there? Case law notes that it's not so simply because an expert says it's so — even if they're wearing a hat with their name on it. Okay. So let's talk

Ryan Kalamaya (14m 3s):
About the PLAN. You have the PLAN model, and you've got to go through this in a strategic manner — I think that resonates with a lot of lawyers. So, what's your plan? John, what does it mean when you're dealing with the PLAN model?

John Zervopoulos (14m 20s):
Well, it's a systematic way of being able to critique, take apart, and understand the evaluator's work and their testimony. Usually, what will happen is that we lawyers get this long report — 80 pages, 90 pages, sometimes longer — and start cherry-picking, because it's just too overwhelming. What you need is a plan to manage that. And PLAN is an acronym for the Psychology Law Analysis model. The emphasis here is that you're not just looking at the psychology part of what the expert is saying; you're also integrating that with case law, so that you can make compelling arguments to the court that combine both the law and the psychology parts of the analysis.

John Zervopoulos (15m 9s):
So before we get into the PLAN model, though, we have to set the table, and there are two issues to deal with. The first issue is touchstone words, and here we have: reliability equals trustworthiness. You know, the notion of reliability has been around for 25 years now, since Daubert came out, and there's a lot of baggage around it — what does it mean, this, that, and so forth. But a footnote in Daubert equates reliability with trustworthiness: can the court trust what the expert is saying sufficiently to base its decisions about the family on it? And I encourage lawyers to use trustworthiness and reliability interchangeably, back and forth, in both their motions and their arguments.

John Zervopoulos (15m 52s):
So the court is focused on what we're really talking about: can I trust this testimony? The second issue is to look at the testimony, or your examination, from two key perspectives: the legal perspective and the psychological perspective. The legal perspective relates to case law, statutes, rules of evidence — things we all should know as we're dealing with experts. The psychological perspective draws from the ethics codes, practice guidelines, and the professional literature of psychology and mental health. And it's kind of

Ryan Kalamaya (16m 27s):
Like, you know: when the law is against you, you argue the facts; when the facts are against you, you argue the law. And this is just a reminder that you need to keep both. I think a lot of lawyers, when they approach some of these evaluations, can get mired — and we'll talk about some of the pitfalls of really digging into the test data and trying to wrangle with an expert on the test data — or, you know, they focus completely on the law. And your point, as both a board-certified psychologist and a lawyer, is that you really have to look at it from both perspectives, and know those two perspectives, to effectively engage in litigation when a mental health professional is involved.

John Zervopoulos (17m 12s):
Right? So — there you go. The way I look at it is that you have to know both the legal and the psychological perspectives, separately and jointly. Separately, you have to master both, right? But as Ryan was saying, if you just stay with the psychological part, you're going to get mired down in testing and methods and that sort of thing; with just the legal part, you're not going to be making the connection. It just happens that there is both case law and a statement from the APA ethics code that show how you can use these jointly. The case law comes from Daubert, and Daubert informs Colorado Rule of Evidence 702.

John Zervopoulos (17m 54s):
And the statement there is that there is an assumption that the expert's opinion "will have a reliable basis in the knowledge and experience of his discipline" — that is, the discipline of the expert. Then look on the other side: psychologists' work is "based upon established scientific and professional knowledge of the discipline." They mirror each other, so that if you find deficits on the psychological part, you've got the language there to walk back and find the mirrored language in the case law, and you can put those together to make some compelling arguments. And in the paper, we talk about a couple of other ways that you can do that as well. So let's get into the PLAN model — we've got four quadrants.

John Zervopoulos (18m 34s):
Step one is to test the expert's qualifications. Step two: examine the methods' reliability. Step three: evaluate the reasoning reliability. Step four: gauge the connections between the conclusions and the opinions. Each one of those steps by itself can support pretty compelling arguments, either supporting or critiquing an expert; put together, they make a nice, structured way to look at whether Smokey knows what he's talking about. So let's look at the first quadrant: expert qualifications. Here's a statement from Shreck: a trial court should consider whether the witness is qualified to opine on such matters.

John Zervopoulos (19m 18s):
Now, Shreck is an admissibility case, but I contend — and there are arguments for this — that Shreck lays out a bunch of the Daubert factors, and other factors as well, to consider. And I consider those basically tools in a toolbox that offer you lines of questions you can use to examine and cross-examine experts, and this is just one of those. So, you look at a CV. In my view, it's important not to get overwhelmed by the CV. Treat the CV as the expert's brochure: they're not going to put a whole lot in a CV that's negative about themselves. They're trying to present themselves as acceptable to the court and to others.

John Zervopoulos (20m 1s):
You'll also want to look at the training and work experiences of the expert. You know, not all experts who do PREs or CFIs are the same. They may come from different backgrounds, different levels of training, and so forth. You have psychologists, sometimes psychiatrists, LPCs (licensed professional counselors), social workers, and marriage and family therapists. It's important to know their background and training, to give you insight into what they're bringing to the table as an expert.

Ryan Kalamaya (20m 37s):
This episode is brought to you by our law firm, Kalamaya | Goscha. Amy and I describe our law firm as an innovative and ambitious trial team that pushes the boundaries to discover new frontiers in family law, personal injuries, and criminal defense in Colorado. We currently have offices in Aspen, Glenwood Springs, Edwards, Denver, and Boulder. If you want to find out more, visit our website, kalamaya.law. Now back to the show. So John, we've got the key professional organizations — and you can explain these organizations — but to piggyback on what you just went over: you need to go through the CV and pick out, has this expert —

Ryan Kalamaya (21m 25s):
If it's a parental relocation case, have they written anything about parental relocations? If domestic violence is the key component of the evaluation, how much experience or training has the evaluator had on that particular topic? That's really going to drive your evaluation — or rather, your examination. So if the expert is someone favorable to your client, you're going to try to highlight and bolster that evaluator's credibility — the reliability and the trustworthiness. You want the judge to know this evaluator has experience and knows what they're talking about. But John, what are these key professional organizations?

John Zervopoulos (22m 14s):
Well, we're looking at the American Psychological Association and the AFCC, the Association of Family and Conciliation Courts. I think these are the key professional organizations for folks doing PREs and CFIs — not only because most of those folks belong to one or both of these organizations, but even more importantly because each of these two organizations also publishes guidelines and standards: how to conduct child custody evaluations, how to do brief evaluations, guidelines for forensic psychology, and so forth. And the way I look at those guidelines is that, even though mental health professionals may say, well, they're just guidelines —

John Zervopoulos (22m 56s):
The way you can look at them, as lawyers, is to see those guidelines as basically generally accepted and peer-reviewed guidance for how the evaluator should go about conducting their work. These guidelines have gone through committees, they're reviewed by experts, and they're finally voted on by the boards of delegates of these organizations. So these are pretty compelling documents that you can use to critique the evaluator, and knowing that the evaluator belongs to one or both of those organizations can be very good in terms of giving you lines of questions about appropriate, generally accepted methods of conducting evaluations.

Ryan Kalamaya (23m 38s):
You know, just a shout-out to my partner, Amy Goscha — she did a presentation with Judge Arkin last year for the Family Law Institute, and I believe those standards were part of the materials they provided. But, you know, John, you have the standards and professional guidelines in terms of how you evaluate, and this is the perspective that you bring to the table.

John Zervopoulos (24m 1s):
Exactly. I mean, failure to comply with these codes or guidelines is powerful evidence that the reasoning and methodology may be invalid. They're still guidelines, but again, they are guidelines that are generally accepted — in my view, peer-reviewed, and so forth. So if the evaluator is going off the rails by doing things that don't comport with the guidelines, and they don't have a sufficient rationale for doing so, that's a problem. Keeping this in mind can help you focus on that. And

Ryan Kalamaya (24m 34s):
John, would an example be that these guidelines provide some standards on, say, whether you interview a child, depending on how old the child is? If an evaluator, for example, doesn't interview or speak with a 15-year-old who is the subject of an evaluation, the standards and guidelines say something

John Zervopoulos (24m 54s):
About that. And what we talked about before, in terms of the conceptual model — that came straight out of Guideline 3 of the APA's child custody evaluation guidelines. And if the evaluator didn't adequately address the parents' strengths or deficits, or the child's strengths and deficits, that's something you can point to — not just in your examination questions, but to say, you know, here is a set of guidelines you could have addressed, and addressed better. And that's one way of using them.

Ryan Kalamaya (25m 26s):
Okay. So what's step two of the PLAN model?

John Zervopoulos (25m 29s):
So step two is the methods' reliability, and we just went through that before, so I won't repeat what we talked about. But, you know, you have data from all these sources — the interviews, the testing and questionnaires, the collateral sources — and how do you put it together? That is the reasoning part. The evaluator now has to use their judgment and reasoning to put this together, to build a story — their story of the family — that hopefully you can incorporate parts of into the story that you're arguing to the court. That's step three: step three is reasoning reliability.

John Zervopoulos (26m 12s):
And what we just discussed was the segue to that. There are really four points to reasoning reliability. First of all, distinguish conclusions from opinions. This is so important, and lawyers oftentimes don't do this. Define conclusions as what the social science data supports. For example, if the depression scale on mom's MMPI is elevated, mom cries a lot, and collateral sources say that she's very sad — for whatever reasons, mom is depressed. Okay. But the opinions come around the recommendations. Just because mom is depressed, does that mean her time with the children should be compromised? Well, maybe so, depending on how she deals with it, but that's a different question here.

John Zervopoulos (26m 55s):
We're looking at conclusions as what comes from the methodology in step two — and keep in mind that these steps build on each other: the qualifications, to the methods, and now to the reasoning. Okay. The second point for reasoning reliability is to understand conclusions, right? Conclusions are inferences. A good dictionary definition of an inference: a conclusion based on evidence and reasoning. Oftentimes experts try to make themselves look very scientific by saying "the evidence shows" this, that, and so forth. The fact is that they're taking evidence that may be supported by the literature, but they're also using reasoning to develop their story. That brings to mind Nassim Taleb — the Wall Street guy — and his book Fooled by Randomness.

John Zervopoulos (27m 49s):
But he made a statement that has just stuck with me since I read it: science lies in the rigor of the inference. All right. So we're dealing with inferences here. Now, to highlight that: oftentimes experts may look at test results almost like a physical x-ray — let's say there's a broken bone, and the physical x-ray shows the break. But psychological testing is not like that. Well, what are test results, even MMPI test results? Here's a statement from the Standards for Educational and Psychological Testing, coauthored by the American Psychological Association and two other prominent testing organizations: when making inferences — remember that word we just talked about — about a test taker's past, present, and future behaviors and other characteristics from test scores, the professional should consider other available data that support or challenge the inferences.

John Zervopoulos (28m 46s):
Again, you know, we're looking not only at the notion of inferences; that statement is also talking about — remember the footstool — the collateral data, the interviews, and so forth. They all start melding together as the expert reasons toward a story that presents their view of the family. Yeah,

Ryan Kalamaya (29m 4s):
John, there was certainly a point in my career, when I was first dealing with these parental evaluations, when this was very easy to get wrong. A lot of people will come in and say, well, the other party is bipolar, or the other party's a narcissist — I probably hear that in about 90% of cases, and, you know, statistically it's around 10%. And it would be easy for me to say, well, there's an objective test: you just fill out the form and the bubbles, and we'll figure it all out. Would you agree that that's just an overly simplistic way of looking at those sorts of test data? Absolutely.

John Zervopoulos (29m 41s):
I mean, again, these are inferences, not just quote-unquote evidence — remember, evidence plus reasoning is the definition of an inference — and you have to put it all together. That's what that slide mentioned. Okay. The third point in reasoning reliability is to rely on Joiner's analytical-gap test. Joiner is the second case of the U.S. Supreme Court's Daubert trilogy, and the relevant quote is that a court "may conclude that there is simply too great an analytical gap between the data and the opinion proffered." Keep in mind, we're talking about inferences again. It doesn't say that there should be no gap, because there are always gaps, right?

John Zervopoulos (30m 26s):
But rather the gap cannot be simply too great for the court to consider it as admissible and reliable testimony.

Ryan Kalamaya (30m 36s):
And we'll get into this later on, but the point for an attorney is: if the evaluator is generally favorable to your client and you like those opinions, you want to narrow that gap. You really want to look at the file and have the materials to narrow that gap, and your examination will be conducted as such. The obvious counterpoint is that if the opinions and recommendations are against you, you want to find the gap and make it as wide as possible, so the judge is sitting there scratching their head: how did you reach that recommendation, given the conclusions and the data that you considered or did not consider?

John Zervopoulos (31m 26s):
And notice, too, that the gap metaphor gives you a nice frame for your arguments to the court, and it's a good mindset to keep. Just like the three-legged footstool, it's another metaphor you can use to guide your arguments to the court about the trustworthiness of the expert's testimony. The fourth point here is that, among other ways (we talk about that in the paper), evaluators hide analytical gaps in their conclusions when they don't actively monitor judgment biases. Several judgment biases may affect evaluators. Confirmatory bias is a rush to judgment.

John Zervopoulos (32m 10s):
They have a notion of how things should be, maybe based on the first interview, or even on a conversation they had with both of the lawyers before starting, and suddenly all the data is funneled through that rush to judgment. Then there's hindsight bias; I call it 20/20 hindsight. Life is messy. If a parent does something that seems somewhat outlandish and maybe shows some poor judgment as we look back at it, well, maybe a little bit of, I don't want to say forgiveness, but at least some cushion in understanding what may have been going on would be helpful. Hindsight bias looks back at what happened and tries to say, well, this is what the person should have done.

John Zervopoulos (32m 58s):
And if they didn't do it that way, then they truly are lacking. Availability bias is what I call top-of-mind bias. Maybe the evaluator went to a workshop on domestic violence, or other workshops on testing and so forth, or they've read about some issues in the papers, and all these issues are top of mind. Then they get into the evaluation they are court-ordered to conduct, and rather than monitoring their bias, the flavor of the month, the issues in their mind given their recent experiences, becomes the funnel through which they deal with the data.

John Zervopoulos (33m 44s):
And that can be a big problem. And of course, overconfidence bias. This is the hardest bias to deal with and the easiest for experts to fall into: I'm right because I know I'm right; this is what it says. The research shows that we shouldn't be so convincing to ourselves and to others, trying to lock in what a particular assertion should be. That's overconfidence. Okay.

Ryan Kalamaya (34m 14s):
John, then we've got the reasonable alternative explanations. Can you tell our audience, what is this meant to do in terms of biases?

John Zervopoulos (34m 24s):
Well, this is what I call the bias buster. All right. Reasonable alternative explanations means the evaluator should keep an open mind to the various explanations of the data they are taking in throughout the evaluation. And if they are actively doing that, then they are testing different explanations against each other. One example: I do a crossword puzzle every day; I enjoy it. Sometimes I'll write in an answer and feel pretty good about it, but then things aren't working out. Instead of just pushing forward, I erase the answer, fill in another one nearby, and suddenly the whole thing changes, right?

John Zervopoulos (35m 15s):
That's an example of dealing with reasonable alternative explanations. We're now in Step Four, which is opinions and recommendations. Here's where you take the conclusions from Step Three and apply them to the concerns of the court and the legal standards. Here's what I feel is the best definition anywhere, and it really highlights what we've been talking about up till now: recommendations are based upon articulated assumptions, interpretations, and inferences that are consistent with established professional and scientific standards. You'll notice that the beginning of that sentence, recommendations through inferences, is really the psychological perspective, right?

John Zervopoulos (36m 2s):
"That are consistent with established professional and scientific standards" is the legal demand. This does so much in terms of pulling together everything we've talked about in the PLAN model, and it also shows how we jointly consider both the legal and psychological perspectives when we make demands of experts in terms of their recommendations.

Ryan Kalamaya (36m 24s):
Okay. So for the next ten minutes, to close things out, we're going to go through some tips and thoughts for examinations when you are in court or in a deposition in a case where you've got a PRE or a CFI. Specifically, you may have a battle of the experts, a CFI and a PRE, or a PRE and a supplemental PRE, or just a single evaluator whose credentials and trustworthiness you need to bolster. The first thing is you have to understand what exactly you are trying to do. You have to find what your client's narrative is and look at the things you are going to highlight.

Ryan Kalamaya (37m 6s):
So the strong or weak legs really matter. If you want to bolster an evaluator's recommendations and evaluation, then you're going to focus on the strong legs; the counter is that you're going to attack or expose the weak leg in an evaluation and try to widen that gap. So if there's a CFI that has not done testing, and a PRE comes into the case and does the testing, that might be an example where the CFI has a weak leg on the three-legged stool compared to the PRE.

John Zervopoulos (37m 47s):
Yeah, I would agree with that. And another way of explaining it, Ryan, as you noted in terms of your client's narrative: you've got a story you're trying to persuade the judge about in making your client's case. But say the evaluator comes out against your client. How do you deal with that? Obviously, one way is to look at the three-legged footstool, see if any of the legs are wobbly, and start addressing that. You can do that and show the deficits of the evaluation.

John Zervopoulos (38m 27s):
And then, after you deal with the methodological deficits, go back through the evaluation, not cherry-picking, but finding areas where the evaluator says positive things about your client, and read them back to the evaluator. Basically, you're using the cross-examination to tell the court your story. You're adding to the materials that the court will hopefully listen to, even as you're cross-examining. Right.

Ryan Kalamaya (39m 0s):
And one of the other points is qualifications. Lawyers and judges can fall into the trap of, well, they went to this law school or that university and wrote these papers. It's only human to put people into stereotypes and hierarchies, and you want to be careful not to do it too much, but don't just overlook the CV. Whether on voir dire or during direct or cross, you might want to delve into it and highlight specific training, publications, or research if your case involves enmeshment or alienation.

Ryan Kalamaya (39m 41s):
Alienation is obviously a loaded word. Or domestic violence, drug abuse, substance abuse. You really want to run the case through that PLAN model and think about how you're going to present that examination when it comes time to go to court. We mentioned the professional standards: having a working familiarity with them, or having someone like John help you out. Does the evaluation have a weak leg because there was no interview of a significant other, a boyfriend, a girlfriend, or a parent? What shortcomings are there in the evaluation? Anything else, John?

John Zervopoulos (40m 19s):
Another thing comes to mind, too: keep in mind that these people are court-appointed. The judge appointed them in the first place. So, certainly not for every judge, but for many, if you try to go after the evaluator like you're going to nail them in a thirty-minute courtroom showdown, the judge probably won't go for it. So what is your point? Critique the evaluator enough to show the judge that he falls short, then use the other parts of the evaluation to tell your story. But keep in mind that you don't necessarily want to show the judge that the evaluation is absolutely no good.

John Zervopoulos (41m 3s):
You want to nudge the judge toward your side, so the judge feels safer making a ruling that might not totally comport with what the evaluator recommended. So there are several different audiences you're dealing with when you're doing a cross, and of course on direct the same thing, right?

Ryan Kalamaya (41m 27s):
In cross-examination, and in particular in recognizing and challenging biases: say you suspect confirmation bias, or you look at the CV and see that the professional just wrote a paper on domestic violence, and you didn't see domestic violence as a particularly significant factor in your case, but suddenly the whole report is all about domestic violence. That might be an opportunity to cross-examine the professional and really highlight that issue. Same thing with the billing statements and confirmation bias, or with looking at the file; we'll talk about looking under the hood again. A lot of judges have heard about these biases, and you can raise them without personally attacking the professional.

Ryan Kalamaya (42m 19s):
You can say, it's understandable; everyone is subject to biases. It's a way to sidestep the personal attack. You don't want to get up there with "you can't handle the truth," like A Few Good Men; that's just not going to work. But if you tell the narrative of, or lean into, a bias, you can say, listen, they were doing the best that they could, but they came into this with a bias.

John Zervopoulos (42m 42s):
Another way of handling that, too: remember we talked about considering reasonable alternative explanations. Well, your client's story is a reasonable alternative explanation. So you can use elements of your client's story in your examination of the expert and see how the expert dealt with those alternative explanations. You can show both strengths and weaknesses of the expert, depending on whether you're on direct or cross. But also, again, you are showing bias by showing that the expert did not consider a key alternative explanation, the one that you and your client presented to them, right?

Ryan Kalamaya (43m 25s):
On the point of avoiding getting bogged down in test data: most evaluators, if you ask for the test data, are not going to give it to you. They're going to have to give it to a licensed mental health professional, someone like John, who can evaluate it. You're going to be fighting a losing battle if you try to lock horns with an expert over test data, and most judges are just going to glaze over when you're talking about specific raw data scores; it just isn't going to matter. You don't want to lose sight of what exactly you're attempting to do, and that's to present your client's story. So rather than getting far into the weeds of the test data, you can look under the hood.

Ryan Kalamaya (44m 9s):
John, what do you mean by looking under the hood of the PRE's or the CFI's file?

John Zervopoulos (44m 13s):
It's important to find out the basis for what the expert is saying in the report and testifying to. All right. Again, the report is an account of the expert's story of the family, and they're going to tell it in a way that supports how they see the family, whether that support is strong or not so strong. Looking under the hood enables you to say, okay, this is where the expert got this from; but in that same interview, the parent said something else that is relevant and maybe qualifies what the expert is saying. Why didn't the expert account for that?

John Zervopoulos (44m 56s):
And again, it's like buying a used car. You're not just going to buy it because it looks nice; you're going to take it to a mechanic and ask what it looks like underneath. It's the same notion with the billing statements, which we talked about a little before. They're the x-ray or skeleton of the evaluation, whatever metaphor you want to use: they show time periods and dates for everything the expert charged for, which gives you an immense amount of data to put together.

Ryan Kalamaya (45m 33s):
In the final minute here, we've got Smokey the Bear. Just because someone has Smokey on their hat and characterizes themselves as selfless as a PRE or a CFI, you really have to ask, as the evaluator: how do you know what you say? Hopefully we've presented at least one or two things that can help you in your practice, and for the evaluators: how can you present and do your work to help families move on? John, thank you for the insights; it's always helpful. And if you haven't checked them out, John has some great books through the American Bar Association, and he's available for consulting.

Ryan Kalamaya (46m 17s):
But thanks again for your time, John.

John Zervopoulos (46m 19s):
I enjoyed it as well. It was great. I wish I could be up in Colorado, but maybe next year. Thanks, all. Thank you.

Ryan Kalamaya (46m 28s):
Hey everyone, this is Ryan again. Thank you for joining us on Divorce at Altitude. If you found our tips, insights, or discussion helpful, please tell a friend about this podcast. For show notes, additional resources, or links mentioned on today's episode, visit [email protected]. Follow us on Apple Podcasts, Spotify, or wherever you listen. Many of our episodes are also posted on YouTube. You can also find Amy and me at [email protected] or 978-315-2365. That's K-A-L-A-M-A-Y-A dot law.