MICHAEL HENDRICKS: Everybody okay today? Yeah. Well, I had fun yesterday. I enjoyed it. A fellow stopped me in the hall last night, maybe this is a non-random sample, one person stopped me in the hall, but that's okay, and said that he thought what was good about yesterday was that we had given you a tool. You didn't have to use it, or you could tweak it or do whatever you want, but at least we had tried to give you a tool to use. He thought that was helpful. So I appreciated hearing that. And we are going to try to do that again today, because actually I am really quite excited about this morning. This is a topic that we have never talked about, ever. Not last year in Las Vegas, not on any of the webinars; we have never talked about measuring consumer satisfaction together. So I am really, really interested in hearing what you think and what you do and various things like that. It is a brand-new topic, which is exciting to me. And as you can imagine, based on what we talked about yesterday with the three legs of the stool, this is the third leg: evaluating consumer satisfaction. So let's start off, if we can, just remembering your role in this. And I am just going to look down the first column. By the way, just to remind you, yes, you do have in your booklet there a set of all of these slides; it is called, just like it says up there, Evaluating Consumer Satisfaction. The front page looks just like this. So I am going to look down the first column here, and let's ask ourselves: okay, I'm a SILC, what's my role in evaluating consumer satisfaction, or what do we think it is? Well, remember we said yesterday that we think it is your role, by yourself pretty much, to convene the stakeholders, get them involved. But it is your role, with other people, to plan what's going to be done here; your role, all by yourself, to coordinate, make sure it is all happening. And then we are not at all sure, not at all sure, if you are going to be involved in gathering the needed information here, this is the consumer satisfaction information, or if you are going to be involved in compiling and analyzing it. We know you are going to be involved in discussing and interpreting it and deciding what steps to take to improve the weaknesses. We don't know if you will be involved in implementing the improvements, and we know that if there are any amendments to the SPIL you will be involved in that. So you are pretty darn involved in this, right? This is something you are going to be involved in. And let's look at the requirements. This is required by RSA. And here is the language. The state plan must establish, here again it is not "that would be nice to," "we wish you would," "may," no, no, the word is must. The state plan must establish a method for the periodic evaluation of satisfaction by individuals with significant disabilities who have participated in the program. So you have to do this. We all have to do this. And as if that's not enough, there's a second thing that has been established that says you also have to do this: the CIL Standards and Indicators, number 3. It reads: "Each CIL's annual report needs to show that the CIL gives consumers the chance to rate satisfaction and gives the results to the SILC." Now that is interesting; that second part is the part that affects you, right? That second part says the CILs have to be giving consumers a chance to rate satisfaction and giving the results to you. So when we start discussing this I will be very curious to know what you are doing with that.
What you are doing with that information, because you are supposed to be getting it. So, bottom line, what's the bottom line? Your state, your state, I should emphasize that word, your state has to measure consumer satisfaction, and your SILC has to be involved to some degree, no doubt about it. Okay. So it is the third leg, it is consumer satisfaction, and this just summarizes it. It is required by RSA, and every state is already doing this to some degree, but in very different ways. So what we did, and now we are going to present some new data that you have never seen before, is a little survey of three states that we think are doing different things in consumer satisfaction. We want to show it to you because it gives us a format and framework to talk about what you might want to do in your own state. So if you have your handouts there, I would strongly suggest you turn to them, because it will probably be hard to read on the screen all the detail we have learned from this little survey. Actually, I did this survey myself; it was a phone interview with three states, and then I reviewed some documents that they sent me. It was really quite interesting, very interesting to me. You can see we are anonymous here; we are not naming the states. A, B and C we are calling them. What we found was that these three states, and I'll bet it applies to all 50, survey consumer satisfaction very differently, and, on the next page, they analyze consumer satisfaction data differently. But let's go back to the first page. They survey consumer satisfaction differently. You survey consumer satisfaction differently. So let's look at the first line here, the key aspect: how is the survey administered? You can see that state A does it through telephone interviews, and states B and C do it through mailed questionnaires. Who mails the survey out, if there is one? Of course it isn't relevant for state A, but look at the difference between B and C. For state B, each CIL mails out the survey, and for state C, the SILC itself mails out the survey. What percent of all consumers are surveyed? Well, in state A, each consumer is surveyed at some time or other, at different times; in state B they are all surveyed at once; and in state C, all the closed cases are surveyed and a sample of open cases. So look at the differences there. How often are they contacted? It varies by CIL, every three years, and each year. So a lot of variation there. Who develops the survey form? Each CIL, all three partners, and all three partners. So there's variation in who develops the actual form. Is an outside consultant used? No, no, yes. So one state uses an outside consultant, and you can see the outside consultant receives the data, enters it, analyzes it and writes the report. The other two states don't use an outside consultant. How many questions are asked? You can see generally 11 to 12, 12, and around 15. So roughly that's the number of questions that seem to be involved. Does it ask any demographic data? In other words, do you care if it is a male or female who is replying, or whether it is a person with one kind of disability or a different kind, or their age? In the first state, no, they don't ask that. In the second state they do ask; they care about gender, age range, race and disability. And in the third state, they care about disability, type of program, and whether the person is living where they want to live. So there are differences there. So what are the open-ended questions?
These are questions where a person doesn't just check a box but writes something in. Well, in state A it varies among the CILs. In state B it is simply "Comments," because that's all there is. And in state C they ask how the services have helped, whether other services are wanted, and what we can do better. So there's a range there of the kinds of open-ended questions the different states have. So that's how the states survey. Yeah, a question over here. Hang on for the mic. Get it on. AUDIENCE MEMBER: Okay. In these surveys, are the questions about the core services, etc., that are offered at the CILs, or are the questions about independent living needs, period? MICHAEL HENDRICKS: They're not about needs. Remember, these are surveys of consumer satisfaction. AUDIENCE MEMBER: Consumer satisfaction with what? MICHAEL HENDRICKS: With services, their experience with the services. AUDIENCE MEMBER: The services of the CILs. MICHAEL HENDRICKS: Of the CILs, because that's where they would be getting the services. But as you can see, we ask different things in different states, don't we? We are not uniform, are we? We are not asking the same thing in all of the different states. AUDIENCE MEMBER: How do we know that? MICHAEL HENDRICKS: Because that's what this table shows: we are asking some different things. Let me go to the next one and we will see. Okay. We also found an interesting thing: after they get the data, the states analyze the data differently. So for instance, who enters the data from the surveys? In state A, it is each CIL. In state B, the SILC itself, isn't that interesting? The SILC itself enters the data from the surveys, as does each CIL. So there is kind of a duality there, if you will, or a little duplication. And in state C it is a consultant, an outside consultant. Three completely different ways to enter the data. What's, ah [laughs], this next line is really interesting: response rate. Let's stop for a second, because we will come back to this in a bit. Are we all familiar with what response rate is? We really need to be, because it will be an important part of our discussion this morning. So, somebody want to try to give a good definition of response rate? Who's willing to take a crack at that? There you go. AUDIENCE MEMBER: The rate at which the people you send that survey to, or hand that survey to, return it. So if I send out 100 surveys and I get two back, my response rate is 2%. MICHAEL HENDRICKS: Excellent, excellent. How does that differ from sample size? You're good. Look at this. AUDIENCE MEMBER: The population is all of the people who have received services, in this particular instance; as usual it depends, but all of the people is the population, and the number of people that you send the surveys to is the sample size. The number of people who return them is the number of people who return them, but the ratio between those two is the response rate. MICHAEL HENDRICKS: Excellent. Stay with the mic for a second. Let me ask you to give a number. AUDIENCE MEMBER: Geez. MICHAEL HENDRICKS: So you said you sent it out to 100. AUDIENCE MEMBER: Correct. That was my sample. MICHAEL HENDRICKS: Give me a number for your population, all the people that have received services. AUDIENCE MEMBER: We will just make it real easy; we have a thousand people in the population. So an n of a thousand. MICHAEL HENDRICKS: You are so good. So if a thousand people have received services within your state, then that's what, just as you say, we will call the population.
But maybe we don't want to send out a thousand surveys. That's a lot of money. Or maybe make a thousand phone calls; that's a lot of work. So we will survey some smaller section of that, which is called the sample. AUDIENCE MEMBER: Sample. MICHAEL HENDRICKS: Sample, and you said you are going to do 100. AUDIENCE MEMBER: Yeah. MICHAEL HENDRICKS: And if we do that correctly, we will talk about this later, if we do it randomly, then we can trust that these hundred represent those thousand pretty nicely. But then, you are saying, maybe not all 100 will send it back. AUDIENCE MEMBER: Correct. MICHAEL HENDRICKS: And you said that maybe 2 send it back. AUDIENCE MEMBER: Correct. MICHAEL HENDRICKS: And if only 2 send it back? AUDIENCE MEMBER: The response rate is 2%, and your sample size is 10% of the population. And if my zeros are going in the right place, that would mean that your response rate was, is it .2 or .02% of the population? MICHAEL HENDRICKS: The response rate is just the percentage of the sample that returns it, so 2 out of 100 is a 2% response rate. (Those 2 returned surveys happen to be 0.2% of the thousand-person population, but that's a different number; the response rate is figured against the sample, not the population.) AUDIENCE MEMBER: Right, right. MICHAEL HENDRICKS: So this is really important for you folks, may I suggest, because later on you are going to come to dislike me intensely. I just want to prepare you to know why you are going to dislike me intensely. AUDIENCE MEMBER: So could I ask the room to do something right now? MICHAEL HENDRICKS: Sure. AUDIENCE MEMBER: Tell Mike you really like him right this minute, so that later when he gets there... AUDIENCE: We love you, Mike. AUDIENCE MEMBER: When we find out how much we hate him. MICHAEL HENDRICKS: This will be your last chance. Thank you for that. So yes, in this case, if you sent out 100 and you only got 2 back, you would have a 2% response rate. Now let's go up here and see, in real life, what your colleagues are getting as a response rate. What is the overall response rate? Well, for state A, remember, this is the state that calls people up. It is a lot easier, when you talk to them on the phone, to get them to talk to you than to get them to send back a questionnaire. So theirs is pretty high. What is the response rate for state B? 13%. What's the response rate for state C? 16%. This is why you are going to dislike me intensely in a bit. Let's just remember those numbers. Keep them in mind. Who analyzes the data once it does come back? Each CIL, each CIL, or the consultant. So a variation there. How is overall satisfaction calculated? We are not sure how state A does it. In state B it is an average of 12 questions, and in state C it is the percentage who strongly agree or agree on each item. Here's an interesting part. Are closed-ended questions cross-analyzed? That sounds really nerdy, doesn't it? But it is basically like, if someone says overall I am very satisfied, and then maybe you had also asked them how satisfied they are with the staff they work with, it would really be nice to know: well, the people who didn't like the staff, are they less satisfied than others? Does the staff really matter, you know? That seems not to be being done. I am not seeing that being done in any state at all. Are closed-ended questions cross-analyzed? No, no, no. I'm sorry to see that. That's an opportunity missed, may I say. [A sketch of what that cross-analysis could look like appears just below.] How are open-ended questions analyzed? These are the ones where you don't check the box but you actually write something in. Well, I frankly was kind of surprised and not so happy with this either. In one state, one of the CILs compiles the verbatim responses, just compiles them.
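[Editor's sketch: the response-rate arithmetic rehearsed above, plus the cross-analysis and demographic breakdown Hendricks notes no state was doing, as a minimal Python illustration using the pandas library. All numbers and column names here are hypothetical, invented for illustration only.]

    import pandas as pd

    # The arithmetic from the exchange above:
    population = 1000        # everyone in the state who received services
    sample_size = 100        # surveys actually sent out
    returned = 2             # surveys that came back

    sampling_fraction = sample_size / population  # 100/1000 = 10% of the population
    response_rate = returned / sample_size        # 2/100 = a 2% response rate

    # A toy batch of returned surveys, scored 1 (very dissatisfied)
    # through 5 (very satisfied):
    df = pd.DataFrame({
        "gender":  ["F", "M", "F", "M", "F", "M"],
        "overall": [5, 4, 2, 5, 3, 4],
        "staff":   [5, 4, 1, 5, 2, 4],
    })

    # Two ways to calculate overall satisfaction, echoing states B and C:
    mean_score = df["overall"].mean()             # average across respondents
    pct_satisfied = (df["overall"] >= 4).mean()   # share who agree/strongly agree

    # The missing cross-analysis: are people unhappy with the staff also
    # less satisfied overall?
    print(pd.crosstab(df["staff"] >= 4, df["overall"] >= 4))

    # The missing demographic analysis: does satisfaction differ by gender?
    print(df.groupby("gender")["overall"].mean())

[Even a few lines like these would answer the questions the table flags as "no, no, no."]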
They just list them out, one after another, nothing more. In the second state, nothing at all is done in the statewide report, and in the third state they're compiled, again just listed and listed, sometimes divided into groupings but sometimes not. So we really could do a better job of that. People are telling us good stuff when they write something back to us. We are not really digging into it enough, I think. Are demographic data used in the analysis? No, no, no. That stuff we ask them about, you know, are you male or female, what is your age range, which disability do you have? We are not doing anything with it. We are not looking to see if certain people are more satisfied than others. No, no, no, across the board. Who gets the statewide report? In state A there is no statewide report. In state B everybody gets the statewide report; it goes to the SILC, the DSU, the CILs, the SILC website and a press release. In state C, it goes to the SILC, the DSU and the CILs. The last one is how does the SILC use the findings, and I just put question marks across there because, not to be disrespectful in any way at all, I really didn't get the sense it was being used too intensively. So, I think it was a really very interesting exercise that we did. We did it just for this session this morning, just to show you what's going on out there, because I don't think anybody had ever looked at how the SILCs are measuring consumer satisfaction. I think this is the first time anybody has done it. What I take away from this is there's a lot of variability, which is not bad. Variability is okay. States do differ. But there's a lot of variability, and maybe we ought to at least think about how we want to approach this a little more systematically. AUDIENCE MEMBER: I think that this is an area where, obviously, the main concern is consumer satisfaction, I would hope so, but I think there is some sensitivity: the CILs do not want to feel like they're being looked at or judged or supervised by the SILC, that back-and-forth kind of thing. And the CILs are so important, they're so vital; the work they do is crucial. But it is tough within that back and forth, because everyone wants the best but no one wants to feel like they're being judged. So how do we get around that? MICHAEL HENDRICKS: I think you raise a really good and important point. Just go back to this table we have got here, remembering your role in this. Remember, that second line is who ought to be planning what we are going to do here. We absolutely think it ought to be all of you. It ought to be the SILC, the DSUs and the CILs, all together, planning what could be done. I would certainly never suggest that you try to do anything without the CILs' very active involvement. But I hear what you are saying, and we will talk more about that. That's a very good point, the politics of it all. Here is another question. AUDIENCE MEMBER: And this might be more of a question for Tim. In states that have a state-run independent living system, as North Carolina does and Arkansas does and so forth, I would assume that the requirement for consumer satisfaction for that part of IL is the same, because it says the state plan must establish a method for the periodic evaluation of satisfaction by individuals who have participated in the program. So when there's a state IL program, is there consumer satisfaction required, a process required, of the state-run program?
MICHAEL HENDRICKS: Was that actually a question you wanted Tim to try to answer right now, or longer term? AUDIENCE MEMBER: I don't care, maybe later, but. MICHAEL HENDRICKS: Tim, what do you want to do? AUDIENCE MEMBER: I guess maybe it is also a question for the states of Arkansas and North Carolina: do their state-run IL systems do that? Nobody from North Carolina or Arkansas here? North Carolina? MICHAEL HENDRICKS: We will get a mic over here. Tim has written it down, in keeping with his promise to write down questions and get back to us. He has written it down. AUDIENCE MEMBER: I'm sorry. The question is, is there a process of consumer satisfaction for the state-run IL system? There is, yes. MICHAEL HENDRICKS: Okay. AUDIENCE MEMBER: But I am pretty sure, if I remember correctly, that in that system the evaluations are not reviewed externally; in other words, they stay within VR for evaluation. MICHAEL HENDRICKS: They stay within VR. That's interesting. AUDIENCE MEMBER: So the SILC doesn't see it? No. Part of what I think makes this interesting is that because we have consumer satisfaction required in the Part C program, the center program, the assumption seems to be that it is a centers thing to do consumer satisfaction, not an IL thing, if you see what I mean. Now, the interesting thing is that in North Carolina, VR also has its own independent living centers. It is a duplicate program. So the interesting political thing that is going to happen with evaluation in North Carolina is that the VR ILs are going to be evaluated, as are the centers for independent living, and we have already written into the state plan that we have a committee to design the instruments. We have not designed the procedures yet, but they're going to come to me, to the SILC director, directly from the consumers. MICHAEL HENDRICKS: They're going to come directly from the consumers to the SILC director. AUDIENCE MEMBER: Correct. MICHAEL HENDRICKS: That's interesting, okay. AUDIENCE MEMBER: I'm with the state independent living council. MICHAEL HENDRICKS: Brad? AUDIENCE MEMBER: In New York we have a partnership among the SILC, the CILs and the DSU to approach the issue, which includes the state-funded and the federally funded centers, the whole statewide network. One of the ways it is handled is through the state: there's language in the state contract to help address it in terms of consumer satisfaction. There's a mandate in it that says you must participate, and it specifies that participation. MICHAEL HENDRICKS: So you actually have it in the contract. AUDIENCE MEMBER: Correct. Did you want to speak, add anything to it? No, that's good. MICHAEL HENDRICKS: I think we have one question over here. Is this a hand up or a stretch? It's a stretch. AUDIENCE MEMBER: Could you please tell us very briefly, what exactly is a state-run IL system? North Carolina or Arkansas or anybody? AUDIENCE MEMBER: Vocational rehabilitation has its own system of independent living centers. There are 100 counties in North Carolina, and the VR ILs serve all 100 counties. There are 17 VR IL centers in the state of North Carolina. As for the centers for independent living, through our council we have eight centers, and they serve 44 counties. The VR ILs are 100% state funded. The centers for independent living are federally funded. MICHAEL HENDRICKS: So it is funding coming from the state level.
AUDIENCE MEMBER: But there's no conflict of interest at VR as our DSU. [laughter] MICHAEL HENDRICKS: Okay. Let's move on then, because our bottom line on this is your state has to measure consumer satisfaction and your SILC has to be involved to some degree. Now what degree? What degree is it going to be? Okay. Here is what we are not going to do. We are not going to try to tell you how to do this, okay? That would be dumb on our part. Your states are all different. You have different relationships, different histories, and different needs. We are not going to tell you how to do it. But, by the way, anybody as old as I am remember when Oliver North was testifying before Congress? It wasn't Watergate, it was Iran-Contra, sorry, thank you, I do have a cold. Do you remember the name of Oliver North's lawyer? Brendan Sullivan. I lived in Washington, so, nerd. Brendan Sullivan was his name. Do you remember Brendan Sullivan's famous quote? Probably not. Okay. So here is the deal. Oliver North is sitting there testifying to Congress, being grilled, and his lawyer is sitting next to him. And his lawyer, Brendan Sullivan, kept leaning over in his ear and saying, mumble mumble. And some congressman said, you know, attorney, would you please let the man testify, or whatever. And Brendan Sullivan comes back with this great line that I love. He says, "Congressmen, I am not a potted plant." So I want to say we are not going to tell you how to do this, but I am not a potted plant, so I have some suggestions, how is that? You don't find that as funny as I do. [Laughter] I thought that line, I am not a potted plant, I love that line. Anyway, so we have some suggestions, I am not a potted plant. We have some suggestions, and we would like to ask you to consider them seriously. How is that? They're suggestions only, for you to consider. And here is the first one: we think there are three key questions to think about. If I were in your shoes in a SILC, I think there are three key questions. One: why do I need consumer satisfaction data? I mean, why? Two: what do I need to have in place before I can actually use consumer satisfaction data? And three: what first steps can I take to move in this direction? That's what I would be thinking about if I were in your shoes. Let's start off. Why do I maybe need consumer satisfaction data? And I will be honest with you, I can't find the answer. And neither can our team. It may be written down somewhere, but we can't find the answer to it. We can find that it's required. We can't find why it's required. It might be a good thing to spell out why it is required, because presumably there's a good reason. So in the absence of that, we have asked ourselves, from an evaluation point of view, why do we ever collect data? There are usually three reasons. One, just to advance general knowledge. That's probably not very relevant for us. Two, to provide accountability, maybe. And three, to improve program performance. That's the one we think is the most relevant for us. You heard me say that yesterday, too. Let's evaluate in order to improve our programs, okay? So we think number three is the most useful to you. So that is why we would like to see you do it.
What do I, if I am in your shoes, need to have in place before I can actually use that data to improve my program? We think there are two separate things. First of all, you've got to have consumer satisfaction data that you can believe in. And second, you've got to have a process to get that data used. I am not sure that second one exists at all right now. I will be curious, in the Q&A, for you to tell me: is there a process in place in your state to get your consumer satisfaction data used? Maybe there is; I would like to hear it. Each is important, we would say essential, and we want to talk in detail about each one. So here is what we have for you. And this is why I think this handout is such a good thing for you to take. Oh, there's a question over here. Okay, we will wait. That's why this handout is such a good thing for you to take home: we are giving a lot of suggestions in this handout of what we are calling survey basics. We have a question here. AUDIENCE MEMBER: Actually, no, I don't have a question, more like a statement. I am surprised that you couldn't come up with a reason for doing customer satisfaction. I remember reading somewhere in the history of all of this that centers are supposed to be consumer directed. It is not just the board of directors or the staff that direct the services, set the policies or implement the services. Wouldn't you assume that customer satisfaction means that the people who are receiving services are directing those in the position of delivering those services and setting the programs of the agency, and that satisfaction from them helps you make a better decision? MICHAEL HENDRICKS: Personally I think that's wonderful, and I would love to see it in writing somewhere, where it just says the reason to collect consumer satisfaction data is to use it to improve services. I think that would be wonderful. I don't think it says that anywhere, yet. AUDIENCE MEMBER: It does say in Title 7, and it does say in the regs of the law, that you are consumer directed. MICHAEL HENDRICKS: Right. AUDIENCE MEMBER: So you have got the various pieces; go through the standards and assurances, all of that, and it clearly creates the path of travel of information from the customer to the board to the service provider, all of them people with disabilities. So from where it comes to where it goes and how it gets set up, it is the people with disabilities. When you look at the various statements, self-direction, self-advocacy, peer mentoring, it is all underscored there. It is redundant. I am surprised that we miss that most basic underlying tone of independent living here. MICHAEL HENDRICKS: I think what you are saying is that it is implicit in everything that's written that of course that's why we use consumer satisfaction. Is that correct? I think that's great. I think that's great. That's why I think we should have it, but we have to have good consumer satisfaction data. So here is what we, oh, sorry, another question here. AUDIENCE MEMBER: I just wanted to respond. I think we all certainly agree that consumer direction is a central tenet of what we are all about, centers and SILCs, but especially centers, and we can probably all acknowledge that consumer satisfaction surveying, or the process of assessing it, is one way of addressing consumer direction within our organizations.
But there are many more, and I think that Mike's point, perhaps, is that it is not explicit that that is what the purpose of consumer surveying is. I think we would like it to be. I think we all sort of think that it should be, but the point is that it is not spelled out. MICHAEL HENDRICKS: That is my point: I think it ought to be spelled out. And when we did this survey of these three states and I looked at how the consumer satisfaction data has been analyzed and used, I think there's more we can do to use that consumer satisfaction data for improving. I guess that's my point. I mean, you know me by now; I will tell you what I think about something. You can disagree, but I will tell you what I think, and what I think is that we are not taking advantage of this consumer satisfaction data like we should. So let's spend a little bit of time here walking through some things that we would call survey basics. Some of this I really hope you will take back with you, because these are some suggestions, I hope, for the future. One of the first things we think you've got to do, very simple, but it varies from state to state, because we have seen this even in just looking at three states, is decide how you want to define satisfaction. It could just be one overall question, and it could be something like this: overall, how satisfied are you with the services you received from this CIL? That's one way to define satisfaction, right? Another way to do it is to say there are different dimensions of satisfaction. You know, there's kind of overall, but then you can ask about different components: how satisfied are you with the facility, how satisfied are you with the staff, how satisfied are you with the services? That's a completely different way to go. Or you can use something like a personal recommendation: would you recommend this CIL to other persons in your situation? That's another way to ask about satisfaction. Or plans to come back again and repeat: if you could go back in time, would you use this CIL again? Each of those is a little different, and you have to decide in your state what question makes sense to you. Okay. Once you figure out what you are going to ask them about satisfaction, then you have to decide if you are going to ask them something else. I strongly suggest you do, because simply knowing that the people in my state are 70% satisfied, I don't know what that does for you. I'm not sure how that helps you. But if you know, for example, that younger people are not as satisfied as older people, then maybe you can start doing something about it. Or if you know that women are much less satisfied than men, then maybe you can start doing something about it. If you don't do that, you will never know where to start working. So I think some demographic or service-related information, like what kind of services did you get; maybe people who got certain kinds of services are extremely satisfied and people who got other kinds of services are extremely dissatisfied. You won't be able to know that unless you ask them what kind of services they got. So I would strongly suggest to you that simply asking satisfaction is not going to give you the kind of information that will help you do something to make your services better. Whether you ask for a name or not, people vary on that, and then other information. Richard Petty, I thought, made a really interesting suggestion.
He said, as long as we are sending out these consumer satisfaction surveys, why don't we use them in some way to ask about needs? Maybe you could ask each person: is there another need that's not being filled? So it is like a needs assessment built into a consumer satisfaction survey. I thought that was an interesting possibility, just a possibility. Here is a suggestion you may not like: I think you ought to be consistent across your state. I think all CILs and agencies ought to ask the same core questions. In one of the states we surveyed that wasn't the case; each CIL could make up its own survey and do whatever it wanted, which is fine for that CIL, but how does that help you as a SILC when you are looking across all of them? I think you ought to be consistent, at least for part of it, and use exactly the same wording for the questions, and include at least one open-ended question. I would suggest something like: what can we do to serve you better in the future? That's a pretty nice open-ended question. You get some nice stuff back from that. But tailor it as needed; there will be variations from one CIL to another and one state to another. If you want to let CILs add extra questions, that's fine, but after the core questions. You know, decide what the core questions are, all ask those, and then let them add their extras. You have to decide whom to survey. Remember back when we drew up here a thousand people who got services in your state. What happens if we decide to send a survey to all 1,000? What's that called? Someone said a big sample. Actually, it is not called a big sample. If you send a survey to every single person you serve, what is that called? AUDIENCE MEMBER: It is called a population survey. MICHAEL HENDRICKS: Well, it is also called a census. In fact, we have our census every ten years; that's why we call it the census, we are trying to find everybody. So you have to make a decision. One of the states we talked to in our little survey did that: they send it to every single person who receives services. They didn't sample at all; they did a census, sent it to everybody. You have to make a decision: are you sending it to everybody, or are you going to send it to some of the people? And when I say you, I mean you, the DSU and the CILs who are making the decisions about all of this. So what kind of sample? Hopefully a random sample. And if you sample, how many people are you going to sample, and how are you going to select them randomly? [One simple way to draw such a sample is sketched below.] So there are decisions about that. Here is a really important issue: when are you going to survey people? When are you going to ask them if they're satisfied or not? Are you going to ask them at the end of their receiving services? Are you going to ask them once a year? Are you going to do everybody at the same time, or different times? You know, this is a decision you've got to make in your state. It is a conscious decision: when are you going to ask people about satisfaction? Then here is a really important one: how are you going to measure satisfaction? Are you going to send out questionnaires? How many people send out questionnaires in your state? Oh, not too many hands. How many people do phone interviews in your state? About the same number, interesting split. How many do e-mail of some sort? A few. How many do web surveys? A few; oh, more actually, there. It sounds like a lot of you do a mixture of stuff. That's interesting. We have a nice mix of the ways you do it.
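[Editor's sketch: the random selection just described, in plain Python. The consumer ID lists, the sample sizes, and the state-C-style design of surveying all closed cases plus a random slice of open ones are all hypothetical.]

    import random

    # Hypothetical consumer ID lists for everyone served this year:
    closed_cases = list(range(1, 301))    # 300 consumers whose cases closed
    open_cases = list(range(301, 1001))   # 700 consumers with open cases

    random.seed(42)  # fix the seed so the draw can be reproduced later

    # A census would survey all 1,000. A simple random sample instead gives
    # every consumer the same chance of being selected:
    sample = random.sample(closed_cases + open_cases, k=100)

    # Or, like state C above: survey every closed case, plus a random
    # sample of the open ones.
    state_c_style = closed_cases + random.sample(open_cases, k=100)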
The survey mode has to be a conscious decision as to what works best in your state. You have to decide who's going to conduct it. Are you going to do it in-house? Are you going to hire somebody outside? Who's going to send it out, the CIL, the SILC, the contractor? Who is going to analyze it? This is one of my favorites: how are you going to record the consumer answers? Are you going to have little smiley faces, or 1-to-7 scales, or pizzas, or glasses of water? You know, what are you going to do? Here is a suggestion I would strongly make to you: pilot test the survey. Get about ten people together, before you send it out to everybody, and have them take the survey and learn from what they're saying. Here are three ways to do it, may I point out. This first one is one that people almost never do. As you've asked these ten people to fill out the survey, have them talk out loud to you about what they're thinking: okay, now I am opening the page and turning it, and I am seeing number one, and I am reading it and I see, I think, what it is about. The answers are dut dut dut... Okay, I am probably going to mark a three because of dut dut dut... Now I will go to the second question. Have them talk out loud. Have you ever done that? It is a really nice trick. Have them talk out loud as they are going through the survey; you will be amazed what you learn from that. They are free-associating about it, and you get an insight into their reactions to it. The second one is, of course, once they finish the survey, talk to them about it; debrief them. And a third possibility is to do focus groups with the ten of them. Remember, I said yesterday don't do focus groups for outcomes, but this is a great time to use focus groups, to get them together. So there are three very nice ways to pilot test your surveys, or to suggest to your CILs that they pilot test their surveys. Tim has stepped out. This is the moment I need to do the satisfaction survey with you, because from now on you will hate me. This is the moment you are not going to like at all, what I am about to say, because you need to maximize the response rate. I suspect we will be chatting about this a little bit. This is why we did our little exercise here of remembering what the response rate is: the percentage of people who send back whatever you sent to them, or who are willing to talk to you on the phone. And again, we are friends by now, so I can speak honestly to you. Your response rates are way too low, way too low. It is really important to up that response rate if you are going to have any confidence whatsoever in what you are learning. 13% and 16% are not acceptable. Now, it is not just me talking. I was telling Darrell a couple of days ago, I actually went to Google. I was sitting here and I Googled it. You can do this too: Google "adequate response rate for surveys," okay? And the first five results that came back, I looked at them to see what they were. The first one said 50 to 80% is good. The second one said 20 to 30% is not acceptable. The third one didn't have numbers. The fourth one said 60% is acceptable. The fifth one said 50 to 60% is adequate. Not a good trend, eh? So then I went to the OMB website and looked to see what OMB said was an adequate response rate.
And they don't really come out with a number, but they say if it is below 70%, isn't that scary, if it is below 70% you need to do a special nonresponse bias analysis, because they're nervous about anything below 70%. Now, I am not comfortable saying 70%. I'm very comfortable saying 50%. I have talked to colleagues of mine in the evaluation field who know more about this than I do. In fact, I once surveyed four or five of them and asked them. They pretty much came up with 50%, or throw it in the trash can, which I know is not what you want to hear. AUDIENCE MEMBER: When you surveyed these surveyors, what was your response rate? MICHAEL HENDRICKS: It was actually a census of some people, thank you for catching me on that. AUDIENCE MEMBER: You didn't answer me. MICHAEL HENDRICKS: I had 100% on the census. AUDIENCE MEMBER: 100% on the census. That's acceptable. MICHAEL HENDRICKS: That's acceptable, really. Now, I know the implication of this news: the implication is that none of you, I think, are collecting consumer satisfaction data that you should be trusting. That's a really scary, awful thing to say. But, you know, my job is to lay stuff out here for you to think about. AUDIENCE MEMBER: Based on the three examples we see here, it seems as though telephone surveys got the highest response rate. MICHAEL HENDRICKS: Absolutely. AUDIENCE MEMBER: And like back in the day, old school: pick up the phone and make a call. MICHAEL HENDRICKS: I am not going to say that; I am not going to say you should do telephone surveys instead of mailed. I am going to say that you need to get a 50% response rate or better. AUDIENCE MEMBER: For people who send out surveys and get a low response rate, how do you follow up to increase the response rate? MICHAEL HENDRICKS: Well, we will talk about that, actually. There are a number of things you can do. In fact, people make whole careers out of just this question: how do you get up to that 50%? We are going to talk about that, and you have some tips in your handout there. But first, if we could, who's willing to tell us what the response rate is in their state? Anybody willing to say? It would be nice to just get a little feedback here. Does anybody know, even for instance, what the response rate is in your state? Bob? Just tell me and then I will say it. Let's wait and get the mic there. Bob says it varies in Arizona. AUDIENCE MEMBER: Actually, I am thinking of Pennsylvania too. MICHAEL HENDRICKS: Okay. AUDIENCE MEMBER: In both states it varied a lot. You had some centers that are very small, and then you had others that would get a very good rate because they followed up with calls. They sent a survey, and then they called people as well until they got it up. So because it is center by center rather than statewide, it varied. MICHAEL HENDRICKS: So each center did something a little different, and the ones that put more effort into it got a higher response rate. Not too surprising. AUDIENCE MEMBER: Ours is fairly high, but we do it on the phone through a web survey, and we require a 20% sample of the entire population of the CIL, and most of the centers are completing the 20% sample. MICHAEL HENDRICKS: This is Missouri. AUDIENCE MEMBER: Yes. MICHAEL HENDRICKS: So what you are saying is, in Missouri, of the whole population you take 20% of them, and then you call them up. AUDIENCE MEMBER: Yes, or do it in person. MICHAEL HENDRICKS: Or do it in person.
AUDIENCE MEMBER: Then it is entered through a web-based product, so it is not (inaudible). MICHAEL HENDRICKS: You get a pretty high response rate. Do you have any sense of what the response rate is? AUDIENCE MEMBER: I could probably look it up, but I know it is probably pretty high, because the centers are (inaudible); we do it directly. MICHAEL HENDRICKS: Okay. There's one back there, and here, and here, and several, okay. Do we have a mic in back? Oh, I am sorry, right here, hiding behind Bob. AUDIENCE MEMBER: Mike, as a center director, I have found that our overall services survey has a much lower response rate than when I survey our Money Follows the Person consumers or our HOPWA consumers. We got at least 50% on Money Follows the Person, and we get close to 98% on the HOPWA program. MICHAEL HENDRICKS: How do you get those high response rates? AUDIENCE MEMBER: We sent out surveys. MICHAEL HENDRICKS: You actually mail stuff out and you still get a 98% response rate? AUDIENCE MEMBER: I think. MICHAEL HENDRICKS: Great. AUDIENCE MEMBER: We pay their rent. MICHAEL HENDRICKS: Ahh, so they feel an obligation. AUDIENCE MEMBER: I really do. I think that skews it somehow, because we do get an excellent return rate. And it has always been positive, so I am not saying we are not doing something right, but I think it is different than when you are talking about peer counseling and I&R; it is not as tangible. MICHAEL HENDRICKS: Okay, that's a good point. In the far back we have one. AUDIENCE MEMBER: Hey, Mike. In Massachusetts, we have been using the same survey now going back, like, 11 years since the last time we redid it. And so you worked with us a little bit earlier this year because we are switching to outcomes, but our response rate the past couple of years has been about 27 to 30%. That's doing a paper survey that the centers are requesting. I think the frustration level is that centers have come back and said, we think these questions are junk; why are we still asking these questions we have been asking for ten years? MICHAEL HENDRICKS: May I suggest that's exactly why we are doing the session this morning. I'll bet a lot of states are kind of just doing the same process they have been doing for a while. This session is to get you maybe thinking anew, how is that, thinking anew about consumer satisfaction and how valuable it could be, and maybe we need to do it a little bit differently, perhaps, who knows. AUDIENCE MEMBER: Absolutely. I think that's the best way to look at it, really. MICHAEL HENDRICKS: Okay. Good. Here we go in the back here. AUDIENCE MEMBER: This is Roberta from Minnesota. I guess one of the questions I have for the SILCs who do the customer satisfaction surveys is: are there data privacy issues? Because I know that we have talked about that before, and we would get pushback from the centers, who are each doing their own and each doing it in a different way. I am just wondering if there are data privacy issues around getting names and numbers of people. MICHAEL HENDRICKS: Let me ask, is there a SILC in the back that feels they can respond usefully to that? AUDIENCE MEMBER: The regs, in Part 364 I believe, indicate that all of the information identifiable to consumers has to be held confidentially by the CIL. Each CIL, or each service provider, should have a privacy policy, and you give notice to the consumers.
And they are allowed to say things in that privacy policy and privacy notice like, we have a business agreement with our SILC to help us do satisfaction surveys. And if they do something like that, then they can share that information with the SILC or consultants or whoever. I have a question. MICHAEL HENDRICKS: Please. AUDIENCE MEMBER: For Mike. Can they share that data if they clean the names out? Can they share the information without the names? I think what the regs say is that any data that's identifiable to a consumer has to be confidential. Yeah, okay. So they can share the answers to the questions as long as you have no way of knowing who gave those answers; is that about right? Yes, I think so. Thank you. MICHAEL HENDRICKS: That would be great. I am not at all urging that you attach people's names to all of this. I am just saying, as I mentioned earlier, if you care whether women are more satisfied than men, then you need to know if it is a woman or a man answering. AUDIENCE MEMBER: I understand that piece of it, but if the SILC is doing the survey, then the SILC is getting the names and phone numbers and demographics of each consumer from each separate CIL. That's where my question was, but Mike answered it. Thank you. MICHAEL HENDRICKS: Okay. Good. Excellent. AUDIENCE MEMBER: I also have a question. Would you describe the HOPWA survey that was mentioned? MICHAEL HENDRICKS: The HOPWA survey. Oh, sounds like that was something Bob Michaels mentioned. No, sorry, it was a lady. Pat did; hang on one second, she will. We will let her do that. Sorry. AUDIENCE MEMBER: It is actually Housing Opportunities for People with HIV and AIDS. Those consumers get surveyed both by us and by the state of Connecticut, and that one comes out really high. But we are basically providing rental subsidies for those folks in a transitional program, so the tangibility of it makes it a little bit different. MICHAEL HENDRICKS: So it is a survey of a specialized group of people. AUDIENCE MEMBER: Yes. MICHAEL HENDRICKS: And that's why we think they're more willing to respond. AUDIENCE MEMBER: Yes. Can I also say, while I have got the mic, the other issue around confidentiality of consumer information comes up when the SILC or any outside entity wants to survey your consumers and they want your mailing list. That's where we get into difficulty from a CIL perspective. What we have typically done in the past is they give us the survey, we mail it out, and it comes back anonymous. So we are not releasing anybody's information. MICHAEL HENDRICKS: I am glad you said that, because that's one thing I found in the little survey of three states: there are some interesting and useful ways you can divide it up, where somebody creates it, somebody else mails it, somebody else gets it back. You can solve some of these problems just by being creative about stuff like that. AUDIENCE MEMBER: We need great diligence and perseverance to increase these response rates. Many of you know surveys; I have done lots. Some people don't like to answer the phone, don't like to respond. You have to be very persevering. MICHAEL HENDRICKS: Well, that's a perfect lead-in to what we are going to do next, but let's see what Pat has to say first. AUDIENCE MEMBER: Watch this, you've got three mics.
Part of what I am sitting here wondering is, if you actually run a fairly small center, as some of our centers are, they may serve, in terms of consumer service records, 125 people a year. And you and your staff are in touch with those 125 people on a pretty regular basis; there are gatherings at the center, there are planning sessions and review sessions, and there's accompanying people places and all of that. And if you have an almost intimate relationship with all of the people that you serve because you're small, this would seem to me like taking a cannon to a fly swatter, you know what I mean, to do something this complex, or potentially complex, when you're fairly close to people anyway. You've developed a trusting relationship, and you can get feedback in a much less formalized way about how you can improve things. In fact, engage the consumers in how you can improve things, because part of what independent living is about is taking responsibility. So it is not about us as the service delivery agency fixing things for you, consumer. And that's part of what, you know, the tone of some of the questions is: how can we make it better for you? Well, we are all in this together, and how can we make this better together? MICHAEL HENDRICKS: It is a good distinction. I like that. AUDIENCE MEMBER: So the whole notion of a consumer satisfaction survey is, unless you have a tone in there of how are you getting involved to make the world around you better, then we are left with that notion that we are a service provider handing something over to people and doing for people. And I don't know how we get at that. In fact, that may be why we have this requirement of doing consumer satisfaction: because it was directly plopped out of another law and stuck in here, and it doesn't make any sense. MICHAEL HENDRICKS: I don't know the history of it, so maybe. AUDIENCE MEMBER: It wouldn't surprise me at all, because a lot of the regulations that are in Title 7 are plopped in from someplace else, I think, which is why we need to rewrite them. But I just think that people need to use some common sense about how to gain feedback and engage people in making the services better, rather than us fixing it for you. MICHAEL HENDRICKS: I think you raised two really excellent points. One of them is this one of how you define satisfaction. And you are right, the way it is worded here is, um, well, I guess we didn't actually word one, but there is an implication here of how can we do something better for you, as opposed to how can we all do this together. That's a really excellent point you raised. And the other excellent point I thought you raised is that maybe some kind of formalized way of gathering information isn't necessary for every situation. And there I would just go back to this slide, where, you know, oh, where was it, somewhere in here, there it is: we are not going to try to tell you how you should do this, because we do recognize that there are differences. I think both of those are excellent points. So it is really food for thought; see if it fits your situation. One here? AUDIENCE MEMBER: I think that that is an excellent way to find out information, but it doesn't provide you with information that can be compared across your consumers. It depends so much then on what questions you happen to casually ask. You cannot say, for instance, that the people all said they were satisfied or very satisfied.
It is not that I am married to a particular format of answers, but you cannot take that as a whole and use it as effectively as you can if they are being asked the same questions. MICHAEL HENDRICKS: And remember... AUDIENCE MEMBER: I do think there's a really important part to be played by that piece that is being placed in there. It is not just that we have always done that; there are very important roles for that kind of a survey. MICHAEL HENDRICKS: And remember, too, who we are here, at the state level. We are the SILC; we are supposed to look across all of the CILs in the state, and, like you say, consistency in what's being asked at the CILs surely helps us do that. [A sketch of how consistent core questions enable a statewide analysis appears at the end of this section.] So there's a very valid point there. This is obviously an interesting and relevant topic to all of us. AUDIENCE MEMBER: This is Sheryl from Hawaii. I think that clearly what Pat said, and I don't know who that was next, but. Sorry, Mary from Nebraska. Okay, Mary. Clearly what both of you just said illustrates to me that in different places, how you get the information is going to be very different, and how it is useful is different. Because in Hawaii you will never get 50%. I can almost guarantee you might get a 40% rate, but never 50% or beyond, I doubt it. But possibly the way you are going to get the information in Hawaii is by talking to people, because if you don't ask about the family and the whatever first, you will never get down to business. So talking story in Hawaii is really clearly the way that you are going to get more information than any other way. But I see a need for both. I see a need for having clear data that is, you know, somehow written down and analyzed, and I see a need for being able to gather people's stories, if you will, about how services have affected them, and I also see what Pat is saying about, what can you do to help make it better. MICHAEL HENDRICKS: I like the way you put that. There are different cultures in different states, and you have to fit into it and find out what it is. AUDIENCE MEMBER: If I may, I still say, absolutely, whatever format you want to use, whatever format is going to be able to get you a better response rate. However, if you are not asking the consumers the same questions, you haven't got, and I'm a nerd, you don't have things to statistically count on. So: how is the family, you know, is Joe going to college this week? Is there anything about the center that you would improve? How satisfied are you with the center's services, and if Joe goes to college, what classes is he going to take? You still have to have consistent data being collected. And I am calling it data because that's what it is; whether it is Joe is going to college and going to BYU, that's still data. It is just in the language of data collection rather than in the language of storytelling. It doesn't matter; you still have to have some of the same stories. MICHAEL HENDRICKS: I think what I hear you saying, tell me if I am wrong, is that even though you may need to do different introductions, do different warm-ups, reach people and get people motivated in different ways, at some point we ought to decide what we need to learn from consumers and find a way to get that somewhat consistently across all of the consumers, so we can do some kind of, see, there's the part I don't see written down, so we can do some kinds of analysis that let us see ways to make our services even better. I think that is what I hear you saying.
AUDIENCE MEMBER: It is wonderful to have a lot of anecdotal data in there, that's very, very valuable, but if you can't analyze it in a way that helps you move forward, it only takes you so far. That's my nerd answer. MICHAEL HENDRICKS: All right. Well, fortunately, someone asked how in the world you would get to 50%. We have some ideas, but I will suggest we are awfully close to break time, and this is a logical break. So why don't we take our break. When we come back, we will present our suggestions to you for how to get up to 50%. Fair enough?
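[Editor's sketch: the consistency point made in the last exchange, that identical core questions across CILs are what make a statewide comparison possible, illustrated in Python with pandas. The file names and column layout are hypothetical.]

    import pandas as pd

    # Each CIL reports its results with the same core columns,
    # in this hypothetical layout: cil, overall, staff, gender.
    frames = [pd.read_csv(f) for f in ["cil_a.csv", "cil_b.csv", "cil_c.csv"]]
    statewide = pd.concat(frames, ignore_index=True)

    # Because the wording and the scale are identical everywhere, the SILC
    # can compare satisfaction across centers...
    print(statewide.groupby("cil")["overall"].mean())

    # ...which is impossible when every CIL writes its own questions.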