ROUGH DRAFT 9-15-11, Outcome Measures for Centers for Independent Living – An IL NET Resource Presented by ILRU >> Hold off on starting our video for about 2 minutes while I speak to the room. Can you do that? >> I can pause. >> It's already live? That is okay. Just leave it. Don't pause. We'll just talk to everyone and it will be just fine. >> Okay. >> I think we're at about that time. Good morning, everyone. Welcome to day three of Outcome Measures for Centers for Independent Living with our trainers Mike Hendricks and Bob Michaels. We have covered a lot of ground and done a lot of interesting activities with the audience. I want to share what we have been doing as far as technology is concerned. As we began, I explained this is our second beta test of the live video cast. Although the first went off without a hitch, we have had some flaws for this one. You probably saw a lot of flurry in the back of the room, with those of us at the back table typing furiously at our computers and scurrying and talking to the audio visual people. That is because during the day yesterday, the hotel's internet connection failed. Not just the one for our room; the connection for the chiropractors' convention next door and the business accounts in the hotel as well. It worked intermittently throughout the morning and failed in the afternoon. We spent a lot of time and were probably distracting back there as we were trying to resolve the problem. We apologize to those of you in the room for that. Most especially, we also apologize to those of you online for the interrupted service. The team worked late into the evening; they have not restored the line but actually provided an entirely new line for us to work with. It should be stable. We should be able to continue the day. The very good news: even though it wasn't all broadcast, every bit of what we did was recorded. And we will in very short order, even more quickly than usual, have the full video posted, including Commissioner Rutledge's presentation. All of that will be posted to the web early next week. Anyone here in the room can come back. Those of you who missed portions while watching on the web will be able to go back and get the full presentation. We're excited about that. All of that is recorded. We have it. It's locked in. So watch for an announcement from us early next week that will announce its availability, and there will be a page with not only the video content but all the associated materials. You will be able to see Mike and Bob in perpetuity, which I'm sure you want to do. We do. >> I want to once again thank Bob and Mike for their presentations and creativity and for pushing us to do some very hard work. To thank Darrell Jones and her team, Carol Eubanks who did the work of putting this together, Tim Fuchs and his team for logistics, Don and Kevin who have handled our part of the technology, and the AV team. We appreciate you too. Finally, you know, this outcome measures work is going to take some courage on your part to put it together and make it happen within your center, and for how we as a field work together to come up with some measures that have meaning across centers. Maybe it won't take the kind of Bob Kafka charge-the-hill-with-a-cup-of-coffee courage. You know, but it will take courage, and working with your staffs and convincing them and your boards of the importance of doing this. It will probably take hard work making decisions across centers as to what things we want to report together.
We want to be careful, as we said, about not getting into a situation of trying to compare centers to one another when the circumstances within centers may be so dramatically different that it's truly inappropriate to do so. Still, being able to report overall to constituents and funders and others in ways that demonstrate the true value of independent living will take some very hard, thoughtful, careful, but very important work. I feel like we are in a place in the movement where those of you in this room, because you're here and you're interested and because of some of the things you have said, lead me to believe we have the right people having the discussion. We have the right people who are committed to that. So I encourage you to, as Justin would say, lead on. Take on this important issue. I wish you the very best as you do it. And now, Bob and Mike. >> MIKE HENDRICKS: Richard, before you go away, I wonder if I could ask you to do us a favor. You mentioned Darrell Jones. I wonder if we might get her to come up. >> She is so shy. But why don't you come up, Darrell. She is grumbling; I can hear it all the way from here. >> MIKE HENDRICKS: For the people who know Darrell, you probably have a sense of how important she has been to these two and a half days. For those who don't, first of all, Darrell was our boss. Bob and I worked for Darrell on putting this together. Lots and lots of phone calls and e-mails and conversations. And if the sessions held together at all, if it made sense, if the material seemed to be helpful, probably 80-90 percent is because of Darrell. I think we should thank her for that. >> (Applause). >> MIKE HENDRICKS: Richard, perhaps you could also ask Renee Cummings to come up front. >> Renee. >> MIKE HENDRICKS: Darrell and Renee have something in common I bet nobody in the room but me knows. >> I know. >> MIKE HENDRICKS: What is the date today? Renee? What is the date today? >> It's the Ides of September. >> MIKE HENDRICKS: Do any two women we know happen to have birthdays today? >> (Applause). >> MIKE HENDRICKS: I'm an awful singer, but I think we ought to sing happy birthday. (Singing). >> MIKE HENDRICKS: Thank you for letting us embarrass you. Anybody else having a birthday today, by the way? Let me mention a couple things. These fall into the category of responding as best we can. Remember yesterday afternoon we brainstormed some nice, I thought, specific, concrete things you could do when you go back to your CIL to start down this path? It's already on the Wiki. It's already been typed up, posted to the Wiki, and it's inside that session from yesterday. I think it's called challenges, opportunities, and first steps when you get back to your CIL. Maybe in a few days you decompress. You ask yourself, what was it I might be able to do to start off? You can go there and have the good ideas from yesterday. That is one thing. >> Remember on Tuesday evening you gave us some extra questions or things we needed to cover a bit more or whatever? Also, we have a couple from our online friends that we put in here too. We have taken these and folded them into our last session today, which is going to be called Pulling It All Together. We're going to address as many of these questions as we possibly can, as best we can. What I want to say is, if you have other questions, jot them down now. That will be our time, our last chance really, to talk about some stuff. Any other announcements by anybody? Things we should say before we jump into stuff?
I'm really, how do you stress the importance of the last session in a training program? How do you convey to people that even though it's the last session, please don't think it's the least session? This to me is in some ways maybe, I don't want to say the most important, but certainly as important as anything else. This is when we talk about using the outcome information. We have done all eight of our yellow brick road steps. Here we are; we have the data. I'm going to be honest with you and say I think in our field test, this is where we have been the weakest. I think in the outcome measures field in general, this is where it's the weakest. People tend to think that, well, you know, of course we'll use it. There's no question. We're absolutely going to use it. We don't even need to consider how we're going to use it. It will happen automatically. The problem is, it doesn't happen automatically. It especially doesn't if you don't think about how you're going to use it up front. So I wanted to, as best I can, fumbling here, but as best I can, say gosh, this is an important topic, okay? Let's really remember the whole reason we're doing this is to use it. To do that, maybe we would go back to what may have been the first slide we ever saw, on Tuesday. What is the difference between outcomes management and outcomes measurement? Remember we said measuring is that research activity where we measure performance and report findings, and that is it. Gosh knows there's a bunch of outcomes measurement going on in the world. But outcomes management takes that next step. Remember? It encourages a program to systematically use that performance information to learn about its services and improve them. I was so pleased when Daisy was talking yesterday about how they had a quality improvement team. I think that is what it was, a quality improvement team. They must do exactly this. They are trying to use the information to learn about services and improve them. Why is it happening, how can we make things better. That is what we want to do. Remember, we saw this graph, this picture here. We do it because we want to first increase our effectiveness and tell our story. I love this quote. This is the guy who was Rudy Giuliani's police commissioner. If you follow the outcomes field, you know New York City, especially the police department, was one of the pioneers in measuring, watching it, and doing something about it. Obviously nothing is perfect, but this was a pretty doggone good system. They actually did affect crime rates quite a lot. I think it was called city stat. Anybody know for sure? I think that might have been their system, something like that. And here was this quote, which I love. No one ever got in trouble if the crime rate went up. They got in trouble if they didn't know why it had gone up and didn't have a plan for dealing with it. Now that is outcomes management. You can't always affect what is happening, but you can affect how you react and respond and try to make it better in the future. I think that is a great quote for what we are trying to do here. I would say, as we all know, there are two ways to use this stuff. The first one is outside. Absolutely. We live in a real world. We know we live in a real world. We have people who are going to care how we are doing. We have to have some public relations, obviously. No doubt about it. Let's not kid ourselves. One way to use it is outside our CIL for PR value.
Sure we want to look good to many different audiences; we want to keep our funding, hopefully even increase it. We want to recruit talent, staff and volunteers. We want to promote our CIL to potential clients and referral sources. We want to encourage other agencies to collaborate. Absolutely. That is important stuff. I know that is important. But there's a second way to use it. That is inside our CIL for program improvement. This is where we are weak. This is where I think we're really weak. We're much better at the first; we all know how to paint a picture of what we're doing. We may not always have the raw ammunition, but we know how to do it. This is the part we aren't as good at as a field. Inside: know how effective we're being, find ways to be more effective, help staff to focus on what is important, identify training needs, support both short and long range planning. These are the different ways we can use it inside. That is good to say. But exactly what does that mean? What does it mean to use it inside our CIL? How are we going to go about it? Here is something I made up to show the difference between outcomes measurement and management. The first two stars above the dotted line are the outcomes measurement. Yes, we're going to measure progress on key outcomes and then report outcomes to funders and others. Please don't stop there. I'm just saying, please don't stop there. Please go below the dotted line to the next four things. Here are the four parts that we need to learn how to do. We're not so good at them right now, our task force, all of us. Understand why outcomes are as they are. Identify possible changes that might help. That is brainstorming possibilities, right? Decide which of those possibilities we're going to actually implement. And then implement those. Of course we would be measuring again. Let's take each of those, if we can, and just ask ourselves how we might do each one. First one: understand why the outcome is what it is. First thing to know, this is not a computer issue or a matter of running some different analysis, or running the numbers in a different way. No. This is done by people. People in your CIL have to figure out why the outcome is what it is. These "how are we doing" meetings with staff are an awfully good way to start that process. I would say a good thing to do would be to include all levels of the staff. It's amazing where different insights can come from different levels. Each voice being equal, and being told and recognized that they are equal, no matter their status within the CIL. Really value honest discussion. Obviously, the kind of question you want to have might be something like, well, let's take one from yesterday. 72 percent of our people, what was that? Got the information they needed. 72 percent of our people got the information they needed from an I&R call. That means 28 percent didn't. So why do we think that might be? What do we think the problem might be? Brainstorm all those ideas among your staff. Also have focus groups with clients. Ask the clients: why didn't you get it? Or individual interviews if you like. But find out from the clients why they think the outcomes are as they are. I wanted to put "how else?" You may have another idea. Maybe if Daisy is here, she has an idea of how they do it in their CIL. Anybody have another way that you ask yourself why this outcome is the way it is? Anybody else have another idea for how to get at that?
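For readers of this transcript who want to see the mechanics behind a figure like that "72 percent got the information they needed," it is simply the count of consumers who achieved the outcome divided by the number surveyed. Here is a minimal sketch, not part of the training materials, of how a CIL might tally such an indicator from its survey records; the field names and example records are made up for illustration.

```python
# Illustrative only: tallying a "number and percent" outcome indicator
# from survey responses. The field names and records are invented.
responses = [
    {"caller_id": 1, "got_information_needed": True},
    {"caller_id": 2, "got_information_needed": False},
    {"caller_id": 3, "got_information_needed": True},
    {"caller_id": 4, "got_information_needed": True},
]

total = len(responses)
achieved = sum(1 for r in responses if r["got_information_needed"])
percent = 100 * achieved / total

print(f"{achieved} of {total} I&R callers got the information they needed ({percent:.0f}%)")
print(f"{total - achieved} of {total} did not ({100 - percent:.0f}%) -- the group to ask 'why?' about")
```

The same tally, computed separately for subgroups (for example, male versus female mentors, as the trainers mention shortly), is one simple way to start "looking inside the findings."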
Bob is going to actually get you into some scenarios in a bit and you're going to practice at your table. It's going to be relevant in just a moment. Okay, I don't know any others. That is two at least. That is the first part: understand why the outcome is the way it is. Second, you have to generate possibilities for good changes. Generate some possible good changes. There are techniques to do this. I hope with our task force we can move into putting some of those techniques out there for you. Because we're starting to slide now into organizational development. We're out of evaluation, kind of, and sliding into organizational development and the kinds of techniques and tools they have. They have quite a few. You want to identify the biggest barriers and hindrances. Try to look inside the findings. Remember yesterday we walked through the analysis and asked, are male mentors doing better than female? Parents who attended versus didn't? That is what I mean by looking inside the findings and seeing if there are some hints that can be helpful. Brainstorm with a diverse group of people: what might be a good change to our program? Again, focus groups with clients. Ask the clients, how do you think we ought to change the program to be more effective? Learn good practices from other CILs and agencies. I don't honestly know if the CIL community has a way to share among yourselves effective practices that you have tried, like if somebody hit upon something that really, really works well for something. Do you have a forum, an electronic forum or something by which you share that? Is it just more personal contacts? How does that work? Sorry? I don't know how that works. Is there not a forum? Yeah, Pat. Use your mic, please. If there isn't, maybe there ought to be. Please identify yourself, sorry. >> Sorry, Pat from Georgia. One of the things that we do in our state is that all center directors as well as board members participate in the quarterly meetings. We have reports from home, which is what I stole from Bob Michaels, the brag and steal sessions, so people have a chance to brag about something from home so others can steal it. That is at the state level. >> MIKE HENDRICKS: Nice. >> I think there's a lot of work on ILRU's website that allows people from around the country to learn from one another. Then the mentoring programs. Both APRIL and NCIL have mentoring programs to help centers that have problems. >> MIKE HENDRICKS: So there are some. I hadn't heard that phrase, brag and steal. I like that. >> Where did you get it, Bob? >> MIKE HENDRICKS: He likes to brag and steal. >> Stole it. >> MIKE HENDRICKS: I like that. That is the notion of maybe, let's just look 2 years into the future. Carol has something from the online friends. If we look 2 years into the future, let's say that there's an outcome, let's make up an outcome. Let's make up an outcome that maybe in the future all CILs might be measuring. Something we aren't measuring right now. Come on, come on, you have had your coffee. An outcome that your CIL might be measuring. Every CIL in the country might be measuring. >> Number and percent of customers who are employed. >> MIKE HENDRICKS: Number and percent, let's say, of customers who are employed. I can see some kind of a system where people who are highly successful on that for some reason, you know, exactly how they do it gets captured and shared with other people so others can see if that applies to their situation. Something like that. Learning from good practices in other CILs.
Using electronic listservs passively or actively, just monitoring what other people are saying or getting into it. Then maybe some other ways. Sorry, Carol, yes. >> Two comments from the field. First is from Kathy and the subject is focus groups. She says, why use focus groups? I seem to remember earlier this week you said focus groups weren't a good way to go. Then we also have a comment from Cheryl --. >> MIKE HENDRICKS: Let's take that first if we can. What was the name, Kathy? >> Kathy. >> MIKE HENDRICKS: Thank you. You're right. We did say earlier in the week that when you're collecting outcome information, how is a customer doing on this, how is an I&R caller doing, focus groups are not a good way to do that. I did say that, and I meant that. And that is because why? Anybody remember why? >> You want it independent. >> MIKE HENDRICKS: You want information from each consumer or I&R caller to be independent and not influenced by another person. Whereas a focus group deliberately influences; that is the purpose of doing a focus group, you do influence each other. Absolutely, I did say I would not use focus groups to collect outcome information. This is a different task. Let's go back to this chart. Both here, where I say focus groups with clients, and also here, focus groups with clients. This is a different task. This is not collecting outcome information. This is trying to brainstorm some ideas, some thinking. That is exactly what focus groups are good at, really good at. So if you wanted to say, what are some possible ways I can make my service better in this area, that is what you want people to do: brainstorm and stimulate each other and help ideas along. Focus groups would be excellent for this. So it's a matter of what you're using it for, Kathy. Don't use it for collecting outcome information. Do use it when it's appropriate to use it. Thanks, Carol. >> One more from Cheryl. Topic is other ways to get feedback. She writes, using social media, Facebook, SurveyMonkey, Twitter. >> MIKE HENDRICKS: There you go, those are great additional ways. Thank you for that. Nice to have this online interaction with people, isn't it? Quite nice. Now that you have generated possible good changes, here is another quote. I love this: a great idea is a job half done. The problem is, this guy's football team has just been awful. I think he might need to apply this to himself. Anyway, a great idea is a job half done. Coming up with the idea is half the job. Then you have to figure out which of those changes you are going to make. Say you brainstorm five different ways you're going to change your program to try to do better on this outcome. You can't do all five, okay? You have to figure out what you are going to do. Generally you can't do all five. It's pretty straightforward, right? Discuss carefully all reasonable options. Be satisfied with ratcheting up outcomes. By that I mean incremental improvement is fine. In our field incremental improvement is good. We don't have to triple our success overnight. Let's just ratchet up our outcomes. Reach a consensus in the group. You can probably think of other ways. That is the third. Here is the fourth: actually implement the changes, whatever it is you decided to do. You brainstormed possibilities. You picked out the one or two you're going to do. Now you actually do it, right? I think you know better than I how to implement. Just be clear exactly who needs to do something differently. Be clear exactly what needs to be different.
Be clear when the changes need to happen. Identify and allocate the resources needed. Monitor to make sure changes are occurring. Let me back up to this. Here is the key slide. Don't stop at the top two, just measuring progress on key outcomes and reporting them. Do the next four steps. Doing the next four steps is the difference between outcomes measurement and outcomes management. Understand why the outcomes are as they are. Identify possible changes that might help. Decide which changes to implement. Then implement those changes. Pretty straightforward stuff. Pretty obvious stuff. Often not done. Often not done. Bob is going to lead us through, lead you through, a chance to work on this exact kind of, oh, sorry, we have a question. Please. >> Hi, I'm Jo, here in Oregon. My question is about outcome measures. I have some background in doing surveys. With surveys you want consistency, so you can compare one year to the next, more or less. You never change them; that is the approach that is frequently taken. It doesn't sound like that here, but my question is about outcomes. Do you change outcomes frequently, or do you try to have the outcomes so telling that it's not necessary to change the outcomes, but you might change the way you measure them? Do you understand what I'm asking? >> MIKE HENDRICKS: I do. Excellent question, and I appreciate it very much. I think what Joan is talking about is the consistency over time of what you are doing. Are you measuring something in year one, then the exact same thing in year two, and the exact same in years three and four, so you can actually see how you are doing, getting better or worse or whatever. And I think we can all realize that would be nice, wouldn't it? That would be really ideal, really nice to be able to do that. What it means is you have to be smart enough in year one to know exactly the information you need and exactly how to get it. We're not always that smart, are we? Sometimes we learn, like in our field test. We probably changed. We started off with 12 indicators. I told you yesterday, or the day before, we quickly realized one just wasn't going to work, so that left us with 11. Of those 11, we probably changed three or four, maybe even five, between that year and the next year. We didn't have to change them. We could have stuck with the same ones. That would have been nice and consistent. The problem is it wouldn't have been good. You know? So we said we would rather start doing something better. We opted for being better over being consistent. That is not a good strategy over the long haul. You don't want to change every year, every year, every year, but you also don't want to stick with something that isn't working. It's obviously a balance, Joan. You're certainly right with the idea. The idea is to try to find what really is giving you the information you need and stick with it for a while. Absolutely. Yeah, it's a balance. Great question, really an important question. Any other thing like that? Then Bob. >> BOB MICHAELS: All right. >> MIKE HENDRICKS: These people know how to do this. >> BOB MICHAELS: So now it's your turn. He says maybe I turned this off. I did. There? >> There you go. >> Now we're going to talk about your turn. We have a series of two scenarios here. What I'd like you to do, at your table, is take time with each one of these scenarios. Go through them one at a time. Discuss, talk about it. Talk about it also in terms of, not just here is what it means, but how we would resolve this problem. How would we look at it further within our organization.
Do we need to look at it further within the organization? Maybe this indicates that we did a really good job. So it's up to you. Talk about it at your table. But take about 15 minutes or so on that. The first scenario is: you're reviewing the results of your outcome measures survey and discover your center is placing 100 percent of your institutionalized consumers into the community. How would you use this information? At your table, again, take time, talk about it. Try to decide what it is you would do, how it is you would resolve or address this issue. We'll come back in around 15 minutes. Online people can do the same thing. We'll check with you. Any comments you would send in, we'll discuss those as well. (Table exercise). >> BOB MICHAELS: Now we are at 15 minutes. Let's go ahead and get back together. As we move to the different tables, decide who it is you want to speak. Have the microphone ready. Carol, you said you have people online. >> Do you want me to read both? >> BOB MICHAELS: Sure, first one first. >> We have something from Cheryl regarding scenario number one. She says this is a fantastic finding. Need to market it and write it up in the annual report. Need to figure out how to determine a dollar figure on how much the state saves with deinstitutionalization versus institutionalization, and use the figures to get more money to help more people, because ultimately we will save the state and taxpayers money and individuals will be more independent, productive members of society. The second one, from Joanie, says the information can be used in marketing and in applying for housing grants for developments. Also it would make me ask how long they remained in the community, and why they didn't stay, for those for whom it was short term. >> BOB MICHAELS: Okay. Very similar in terms of this is a good opportunity to take the information we have and use it in a positive way. Do you have another viewpoint? >> Yes, we do. >> We had a really great discussion at the table. One of the things we thought was really looking at that 100 percent and looking inside the CIL first before we took it outside, to make sure this was something that was a good thing before we tried to take it outside and sell it. We were concerned: is this really addressing unmet need? Maybe there is only one person. Maybe there's a lot. Who knows. And we had --. >> BOB MICHAELS: Nobody knows whether that hundred percent is one or three. >> Right. >> BOB MICHAELS: Or 100. >> First was to understand why the outcomes are as they are. We really wanted to look before we took it out. Then we had another dialogue about the kind of comments that just came in about saving the taxpayers money. I think we'll probably save that unless you want to get into it now. >> BOB MICHAELS: We'll be discussing that. Okay, other comments? Julia? >> Charlotte, North Carolina. We decided, we said, this is our life. We said on Tuesday that if you get 100 percent, you need to either go out of business or change your model, because it's not possible. This comes back up and we go back to that. I can tell you what I would do as executive director if this came across my desk. I would gather my staff together and say, how did you define institutionalized and consumer and community? I think we defined the terms wrong. You know, if community means we moved them from an eight-bed to a four-bed ward, that would be a different result than if we moved them from a state institution or nursing home into an independent apartment.
So 100 percent would be a flag to me. I would not brag to anybody. I would think we had the terms wrong and were not being hard enough on ourselves with the definitions. >> BOB MICHAELS: I think the terms are right. Hopefully we have defined what it means to be in an institution, what community-based means, and all that. We defined all those things. >> The first question we would ask is, were we all consistent? There are staff members who think that they are going to get a raise if they give me a high number. >> BOB MICHAELS: Somebody else. What else might have happened? Somebody else's hand is up. >> We were thinking, just because a consumer has transitioned to the community doesn't mean that was a successful transition. It doesn't mean that they are thriving, that they are having their needs met. A hundred percent is great. Again, that could be one person or a hundred people. Looking into that finding to see what that looks like. More than that, how are the customers doing, and addressing those things. Celebrating our success, I heard from another table, internally. Really being able to validate and affirm internally, to be sure the numbers all line up. >> BOB MICHAELS: Again, they might have applied the definitions incorrectly. Basically what we are asking is, at this point they were in an institution, at this point they were in the community. And if that followed the definition, that is fine. But how else could a hundred percent be off? Why else? Anybody want to take a shot? >> Back here. >> BOB MICHAELS: Sorry. >> Karla Larson, Tulsa, Oklahoma. It could be how you are determining who you take on. You may not be letting anyone in who is going to be difficult to place into the community. >> BOB MICHAELS: Right. Back here. >> Our center is across the street from Rancho Los Amigos rehabilitation hospital. We share a lot of customers. They are institutionalized, but we wouldn't consider those consumers as part of our deinstitutionalization program, because they are not ready. They are working on medical things in order to get out. They will eventually. It's not our job to get them out; the hospital is responsible for doing that, and we work closely with them. When I see the hundred percent, I would say, okay, is this a hundred percent of those coming to us for deinstitutionalization, is that what we are measuring? Or are we measuring every consumer that comes into our program? Hopefully we are getting some people that are living in nursing homes or other programs that are considered institutional settings, who maybe don't even want to come out. We promote choice, and there are truly some people out there who would prefer living in those settings or smaller institutional settings. So I would, I don't know, we were commenting about this being negative. We can't possibly be at a hundred percent. There are always people in the works. We need to be going after more people. If we have reached a hundred percent, there has to be more that we can go after. >> BOB MICHAELS: Right, right. Anything over here? Any questions? Comments? In back. >> Kevin Wolferton, Eastern Oregon Center for Independent Living. This table had the same conversations as everybody else did, kind of, you know, looking at why we're at a hundred percent. But if you take the information that was given and just look at how we can use that hundred percent, we came up with some ideas: we could share the successes with legislators, other CILs.
Again, this is aside from questioning the hundred percent, but it's how we could use that hundred percent. Also for outreach and grant writing. Those are some of the ways we discussed using that information. >> BOB MICHAELS: Yeah. So if you had this, if you got this situation and you looked at it, then you would take a look at your organization, and you want to find out what that hundred percent means. Does it mean one, does it mean three? One of the things that I was telling you about yesterday that we ran into, this was especially true of the centers in Pennsylvania, because Pennsylvania has had a program for 20 years now on deinstitutionalization. What the centers have to do is actually go into every nursing home and evaluate every consumer. So they have a really good idea about who wants to move out and who doesn't want to move out. They have a big list. So you have some centers that might get 30 or 40 people out in one year, but they have got a list of 120. >> Right. >> BOB MICHAELS: They have a 25 percent success rate. Yet this center is at a hundred percent; they wanted to have three. So you don't know what it means. One of the things we have to do, then, is take a look at what is going on within the organization. And before we start going out bragging about our money, we want to make sure we know what we're talking about. Right? These are great ideas. I think that is where we want to go. Both people online and in this group are saying this is where we want to go eventually: explaining to people how successful we have been in making cost effective change in where consumers live. But we need to take a close look at what it means first. Final comment on this one. >> Yeah. Pat Puckett from Georgia. Maureen said this at the table. We are thinking that the order of things needs to be flipped. The uses, how you use the information: the first thing would be the internal, the second thing would be the external. Make sense? Maureen said it. >> Makes sense. >> MIKE HENDRICKS: I agree. >> (Applause). >> BOB MICHAELS: That was easy. Now the second one, okay? Again, people online, if you would let us know what you think, we would be interested in hearing that as well. Second scenario: in spite of your new peer support program, you discover when reviewing survey results that only 10 percent of consumers are advocating on their own behalf. How would you use this information? Again you have about 15 minutes, then we'll debrief. (Table exercise). >> BOB MICHAELS: You're all seeing what we're doing here while we're here, and the people on the webcast are able to see part of this. This whole thing is going to be available to you starting on Monday, is what we're saying. There's a lot of coordination. Richard and Mike were pointing out earlier today some of the problems, some of the things, and the people who are involved who are here. There's one person who is not here who is a key person in making this happen. Her name is Sharon Finney, and she is in Houston, and she is coordinating all this from Houston. All this activity goes back and forth, the things that show up, the camera, and all the stuff happening in terms of getting information onto the Wiki immediately. Sharon is responsible for all that. >> Yesterday who said, Houston, we have a problem? >> BOB MICHAELS: Houston, we have a problem. We just want you to know this requires a lot of coordination from her in Houston as well. Thank you again to Sharon. Okay, comments online. Carol, if you would read to us.
>> We had the first comment from Jolene on scenario two. She said, I would hold a focus group to determine how the peer groups could be better served or trained. I would survey peer group members regarding knowledge of self advocacy, then provide training, as well as offer staff to practice with or model advocacy with peers. Then another comment from Cheryl, who says, look internally. Review with each staff person in the peer support program their approach, their knowledge of advocacy issues, what their consumers want or need, and whether the staff is educated on the issue or providing resources to the consumer about advocacy issues. Determine how staff reported these outcomes. Did they account for all customers doing advocacy? Are they staying in touch with their customers on a regular basis to know this information? Is staff being supportive of the consumer in providing peer mentoring? Determine if ILS is needed to educate consumers on issues. Does the consumer need self esteem/confidence building training? Is the CIL providing activities for advocacy issues or systems advocacy? Is there an advocacy group of consumers established and functional? Bring in other advocacy groups to discuss their issues, hold forums where consumers will have an opportunity to be involved. Brainstorm with staff for ideas of how to increase it. Come to a consensus on how to educate and increase, assign duties, implement changes. >> BOB MICHAELS: One thing we know about Cheryl: she types. Thanks, that is great. Joanie, wonderful ideas. Anybody else want to add? What did you learn here? You have 10 percent of consumers advocating on their own behalf in this peer program. How about the two scenarios? >> That is a very good question. This is Jody from Charlotte. Very good question. We went back again to definitions. Our consumers, if you asked them, do you advocate on your own behalf, would probably say no. If we said, are you sticking up for yourself? They would say yes. So if it's built into your peer support program that the goal of the program, whether they know it or not, is for customers to start sticking up for themselves, then that would be inherent. But if your peer support program doesn't include self advocacy skills, you get to decide whether you care about that or not and go back and look at it from the beginning. >> We had a similar look at the definitions. One of the things we focused on was what is meant by a new peer support program. How long has it been up and running would be important. A month or 6 months or a year? What type of goals are established within that system for advocacy? Sorry, Barry from Portland here. In our center, peer support systems are not focusing so much on advocacy per se as a main component. If that is a goal, then you can evaluate it. If it's not, we have advocacy groups, activities. People tend to think they either work on self advocacy with their ILS one on one, or they go out and are involved in a community advocacy group to change systems, to work on my behalf because I'm part of the system that needs to be changed. They may not think in those terms if they are in the peer support system. >> Lou Ann Kuby, Kansas. Actually, it's what he said. >> (Laughter.) >> BOB MICHAELS: If you were the executive director, or part of this quality team that gets together, and you looked at this data, what kind of questions would you ask yourself? Who would you want to look into it? >> One of the things we did talk about was, where was the starting base? >> Yes. >> Did it start at 1 percent?
Then a 9 percent increase, because peer support is so much more than self advocacy, a 9 percent increase in those advocating is actually pretty good. If you started at 10 percent and ended at ten, we said there's a problem, obviously. >> Kansas City. For both of them, whether I see a hundred percent or I see 10 percent, both those numbers are going to cause me questions. If it's a high number it's going to cause questions, and if it's a low number it's going to cause questions. So to me, both those things are things that I would look at and want to know the answers to. >> BOB MICHAELS: Is there a relationship between the 10 percent and the peer program? How do you know? How would you find out if the program is having an influence? Maybe none of the people in the program were among the people interviewed. Thoughts? What would you tell staff to do here? What else would you do here? Hmm. We have already identified some possible changes that might help. So are there key things that you would do first, before anything else here? Yes. >> Dan Tesseler, Alabama. Going back to page 8, one of the first things is to understand why the outcomes are as they are. What questions those would be, I think we have to think about that. But I think that would be one of the very first things I'd want to do. >> BOB MICHAELS: Yeah. If I were doing it, I agree with that. If I were looking at it and I had this information, I would say, okay, what does 10 percent mean, as some people were saying in here? That doesn't sound very good to me. But is there a relationship between that and the program? Maybe that program isn't addressing it; maybe it doesn't talk about self advocacy. Maybe that is not part of their curriculum. So there are a lot of things that you would do in starting to look, not immediately assuming there's some kind of a relationship between the peer support program and self advocacy. You may have decided at some point that that is what you wanted to come out of the program. It may have been when you set up the program that you said this is one of the goals we're going to have. But in fact, they may not be teaching that information. People that are in that program, even if they are in it, may not be receiving benefit from it, may not be getting that training. All this is stuff that you would want to look into and begin to really know and understand what the problem is, rather than just assuming that, oh, our program must not work very well; look, we're not very high on a percent. You don't know. You don't know that at all. Again, what you will be doing is taking a look at what worked and why, and what didn't work and why. That is really understanding. This kind of data you get, even though it looks bad or good, one of the things we're trying to talk about here is that it all can be deceiving. If you really want to sit down and understand, you do like they do at Access Living, what Daisy was talking about. A group of people sit down, analyze the data, find out what it means, then make decisions about that. Somebody else? >> Yeah, Bob. I was going to say, the survey tool maybe needs to be looked at. You may be asking questions that are not about being an advocate. Maybe it's how we define the questions on the tool that you're using. Maybe the tool doesn't work. >> BOB MICHAELS: Maybe the tool doesn't culturally fit in the community. >> I'm Lynn Hatfield from Illinois.
Maybe the surveys were only given to new consumers, and they are just not in that place of advocacy yet; they are starting out. So who are you sending the survey to? >> BOB MICHAELS: Yep. Yeah, that is the kind of information you need to find out. And that is what you look into. So just getting the data is not enough. The data will tell you, here is something I need to look into and make changes on. Okay. >> MIKE HENDRICKS: I realize we're kind of starting to talk a bit about the implementation of that new peer support program. That makes me think of the training that we did together with Darrell's help back in Atlanta, maybe in January of this year, for the SILC Congress, I think it was. I mention this because there's some good information, we thought it was pretty good at least, on the Wiki that you may not know is on the Wiki. We spent half a day on how to assess the implementation of your program. Sometimes people think there's something going on out there. Like you're saying, maybe you think you're serving everybody but you're really serving just your newest customers. It's really important to know what is actually going on out there. What is the "there"? You know, we spent half a day on assessing implementation. There are actually some suggestions for how you might do that on this Wiki. We spent half a day on assessing progress on your objectives, which is an awful lot of what we're talking about here. Then we spent half a day on measuring consumer satisfaction, which I heard someone mention yesterday. We had some suggestions for that. So all of those ideas, for what they are worth, are also on the Wiki. Darrell, am I correct? I don't know how it would be listed on the Wiki. Is it under SILC Congress, January 2011? What is it listed under? Find yourself a microphone and educate us all, please. >> Okay. Sorry. >> MIKE HENDRICKS: Too much birthday cake. >> All those candles. The Wiki has two major sections. One is for CILs and one is for SILCs. If you go to the SILC section, it is laid out in subsections that include the 2011 SILC Congress, which has all of the video segments. So you can view the training. You can download the PowerPoints and access all the handout materials. >> MIKE HENDRICKS: I'm correct, that is when we did that training, right? That was the SILC Congress this year? >> Correct. >> MIKE HENDRICKS: Okay. >> BOB MICHAELS: It's 10:30, time for a break. We'll see you sharply at 10:45. Before we take a break, however, Kelly wanted to make an announcement. >> Yeah. Kelly Buckland with NCIL. For the people here, we wanted to be sure everybody knew the NCIL board meeting will start here today at 1 o'clock. So if anybody is interested in hanging around and sitting in on the NCIL board meeting, you're all welcome to do that. >> BOB MICHAELS: Tim has something as well. >> Quickly, I really hope you all will stay until 12:30, but I realize the reality of the situation. So if you are going to slip away to catch a flight, please, please, please leave the evaluation forms on the center of your table. You guys remember when we did the pretest survey on Tuesday, I mentioned there would be a post-test survey. There's also that more traditional satisfaction survey. We need to find out what you learned and also what you thought and whether you enjoyed it. Those are in your packets already. Please, if you're going to be here until the end, wait until the end. We still have more to learn and talk about. But if you are going to slip out, please leave those on your table. Thanks. >> BOB MICHAELS: See you at 10:45.