MICHAEL HENDRICKS: I was talking to this fellow during the break, and he said, I don't exactly have the answers for this, but I think this is a really important session. And I said, why? He said, to be honest with you, this is the wobbliest leg of the stool. Which I thought was really interesting. He said, you know, you are talking to us about a three-legged stool, and I can tell you from the inside that we have kind of just let this ride. We are doing what we have always done, it has been a long time since we really thought about this, and it is good to revisit it. So he said, I think it is the wobbliest leg of the stool. I thought that was a really interesting way to put it. So Bob says, I will give you a stool sample. Mmm, nooo. [laughter]

AUDIENCE MEMBER: A sample, Bob.

MICHAEL HENDRICKS: But I can say, I can see what your scores are going to be. How long have you been holding that? That's what I am thinking. Here is what we are going to do: give you some suggestions for getting that response rate up to 50%, okay? Now, they're all in your handout. You can take them back with you and mull them over. But here is the thing to know: there are people who write books about this, people who give workshops on this, people who have videos on this. This is information you can get, and I really urge you to. We have references in back for you. So this is stuff that you really need to pay attention to, I think, not necessarily me but the topic. Here is the first suggestion I would make to you.

AUDIENCE MEMBER: Could everybody listen a second here.

MICHAEL HENDRICKS: I think if you are going to do anything, this is the one thing I would suggest you do: consider surveying fewer consumers but working harder to get each one to respond. That's what every expert in the field will tell you. You do not have to send out 10,000 questionnaires; that's a waste of effort. Send out many fewer, but work harder to get those returned to you. Here is an example. We all know the Nielsen ratings, right, the television ratings? Billions and billions of dollars depend on the Nielsen ratings, because that's what advertisers use to know how much it should cost to put an advertisement in a certain show; you know who's watching that show. We have 300 million Americans, is that correct, something like that? Do you know how many people they sample for the Nielsen ratings? Anybody know? How many? Last I heard it was 1,500. Now, I don't have a calculator, but 1,500 out of 300 million is a tiny sample: one out of every 200,000 people, about five ten-thousandths of one percent. But they get those people to respond, and they slice and dice what they get back. They use it and they analyze it. So you do not have to send to everybody. That would be my major suggestion: send out fewer, work harder to get them back.

Now, how do you get them back? What are some of the ways? These are tried and true, and there are people who do research on this: they will send out one survey this way and another survey a different way, so they test this. This is not just people's opinion. This is actual research that has been shown to be effective. Send a letter in advance, and use a personal salutation on the letter. Send the survey with a transmittal letter on a legitimate authority's official letterhead, again with a personal salutation. Tell why the survey is being done: let people know why you are doing this, who is doing it, and how the information will be used. Thank the person in advance.
The survey itself should be short and concise, easy to complete. Use color if possible. Stress the value of the survey to improve services. Again, these are not just my opinions or other people's opinions; they have tested these by sending out some surveys that do it this way and others that don't, and they compare the response rates. So they empirically have data. My nerd friend and I, we trust the data, because they have actually tested these. Include a plea; this is an interesting one, and it really seems to help: it would really help us out if you would return this survey. End with a real signature, not a typed thing, but a real signature on the letter. Include a self-addressed stamped envelope with a real stamp. Send a reminder postcard after one week; just a postcard after one week. Send a second survey after two weeks. If you are calling, call on different days at different times. And someone said here that they send out a survey but they call if they haven't responded; that's another of these good ones. Consider calling those who don't return written surveys, and there are coding ways to know who hasn't responded yet. So those are some suggestions for maximizing the response rate. Yeah, Pat? Maybe you have others; I'll be curious what else you've got.

AUDIENCE MEMBER: One of the things I wonder about is, do rewards help? Like, for example, if your survey is returned by X date, your name will be entered into a $20 Wal-Mart certificate giveaway or whatever. Does that help? Do we know?

MICHAEL HENDRICKS: I don't know, but there are people who do know, in some of these references I have given you. They have certainly done studies on that. We had to pick and choose what we put on our list here for you. We left that off, but I could go back if you want and do some more research on that. But I know that's done, and there are people who know the answer to how effective that is.

AUDIENCE MEMBER: This is Bobby. It is effective. I have done surveys with other groups, and when you advertise that if you come take this questionnaire or survey you get a $5 or $10 gift card from Giant or Safeway, it is a line outside the door. It really is.

MICHAEL HENDRICKS: There are two things going on here, let's be clear. One of them is, if you fill out this survey you will definitely get something; that's what you are saying, yeah. And there is this other possibility: if you fill out this survey, your name will be put in a pool with a chance of getting something. I am sure there is research on each one of them. But you are saying the first one definitely works.

AUDIENCE MEMBER: Just curious, are there any theories as to why an actual stamp works better than a meter?

MICHAEL HENDRICKS: You know, I am not one of those people who has made a career out of this, but it seems to work. They have research that says it works.

AUDIENCE MEMBER: My guess is it is a lot more personal; you know somebody put that stamp on there, compared to running it through a stamp machine.

MICHAEL HENDRICKS: It makes sense. If that's the dynamic that works, who cares? Let's just take advantage of it. Back here.

AUDIENCE MEMBER: This is David Sharp from the Maryland SILC. When you start offering things to people to return a survey, you are going to get skewed and biased input, because they're going to feel obligated to say something nice or better about what they have got, because they got a prize. So it is going to be skewed in that sense.
MICHAEL HENDRICKS: So you're concerned that you can get them to respond, but will the answers they give you be unbiased? I am sure there are people who have tested that; I don't know exactly. A good thing to think about.

AUDIENCE MEMBER: Well, David, I was going to agree with Bobby and say the exact same thing. I have worked for organizations that do that. It is not really biased, because you can say everyone is getting one of these, and you even put in there to buy things like shaving cream and x and y that food stamps simply don't pay for. It shows a level of consideration for us, like, we understand what it is to be like you, and whatever. It is also like, because you are taking your time, we wanted to reward you. It would be biased if it was like, you know, letters A through M are receiving these, you know what I mean? And then we can also say the surveys come back to us anonymous. So I think there's no punitive action if you are not satisfied with something.

MICHAEL HENDRICKS: Is there another?

AUDIENCE MEMBER: Mike, how do we determine the best method to survey for our particular state? Every single one of us is unique. And then as SILCs, what do we do with that information once we get it? Because I've had a lot of SILCs ask me that question: okay, we did this survey, we tallied it, now what do we do with this stuff? "It depends" is a good answer.

MICHAEL HENDRICKS: It sounds, Steven, like you are opening that up to everybody. I hope you are opening that up to everybody.

AUDIENCE MEMBER: Sure.

MICHAEL HENDRICKS: So there's a two-part question there. One part: what process do we go through in our state to figure out the best way to do consumer satisfaction surveys. Is that the first question?

AUDIENCE MEMBER: Yes.

MICHAEL HENDRICKS: The second one is what should we do with it, and do you mean by "we" all of us, the SILC, the CILs, the DSU, all of us together, or just the SILC?

AUDIENCE MEMBER: The SILCs. What's the role of the SILC once we get that information?

MICHAEL HENDRICKS: Let's go back to that. The second one I can address, and the first one is going to vary so much by state, let's see what people say. Remember, we said here that when it comes to compiling and analyzing the needed information, we are not sure of the role of the SILC. There may be some SILCs that, for some good reasons, should actually be the group that runs the analysis; there may be others where no, you are going to hire a contractor to do that part. So that's the actual analyzing of it. But then what to do with it afterwards, discussing and interpreting the findings, I think that's absolutely the part the SILC needs to own. Remember, you are coordinating stuff. The SILC absolutely needs to pull people together to discuss and interpret the findings and decide what steps to take. Again, my only fear about that is what I said earlier: if all we collect is an overall satisfaction rate, 70% of people are satisfied or 95% of people are satisfied, that's not useful. You can't take that back and do anything with it to make your services better. You've got to somehow have some differentiation in there. You've got to know that, well, the women are more satisfied than the men, or the older than the younger, or different disabilities; otherwise there's nothing for you to do. So that is a concern I have, Steve.
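[Editor's note: a minimal sketch, in Python with pandas, of the kind of subgroup breakdown Michael is describing here: the single overall satisfaction rate he warns is not enough, plus the same rate sliced by service type. The file name and column names are hypothetical, invented for illustration; nothing in this sketch comes from the session itself.]

```python
import pandas as pd

# Hypothetical layout: one row per respondent, with a 1-5 satisfaction
# score and whatever grouping variables the survey collected.
df = pd.read_csv("satisfaction_responses.csv")

# The single number Michael says is NOT enough on its own.
overall = (df["satisfaction"] >= 4).mean()
print(f"Overall satisfied: {overall:.0%}")

# The actionable view: the same rate broken down by subgroup, so you
# can see who is less satisfied and look into why.
by_service = df.groupby("service_type")["satisfaction"].agg(
    satisfied=lambda s: (s >= 4).mean(),
    n="count",
)
print(by_service.sort_values("satisfied"))

# The same groupby works for gender, age band, or disability type.
```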
MICHAEL HENDRICKS: So that first question, about the process in each state, it sounds like some other people want to comment on it. So we will start right over here.

AUDIENCE MEMBER: Well, I suspect if you're able to look at that data and look at the various trends and patterns, then as it comes time for the next development of the state plan, you can certainly identify ways that Part B money can be used, ways of supporting the network of centers in the area with technical assistance, developing various trainings, and developing the core competencies that your centers need in the areas where there might be a lack of customer satisfaction.

MICHAEL HENDRICKS: I agree with that completely, so long as your consumer satisfaction data tells you which core competencies they were very satisfied with and which they were not. You need to know that difference, right? Okay.

AUDIENCE MEMBER: One of the things that happened with us is that several of our centers do home modification programs, and it was always a source of dissatisfaction for consumers that it took a long time to get their home modification done; it was always reported. And of course it took a long time because there were long waiting lists and not enough money, same old story. So we used that data to go to our state department of housing and get a state line item strictly for home modification funding. So that's an example of how consumer dissatisfaction was used to help get some money in the budget. Now, it was great, and then within like two years we lost all of those state funds because of the state budget.

MICHAEL HENDRICKS: But still, the story you told is great. You had somehow collected satisfaction data not just overall but from people who got different kinds of services; is that correct? And so you were able to say the people who had gotten this kind of service were much less satisfied than everybody else. So then you were able to look into that, figure out why it was, and do something about it. That's what I mean about getting it at a level we can do something with. I am not seeing a lot of that going on, but I am hoping I am wrong. I hope I am wrong. There's one thing you said there that I wanted to point out if I could. I said yesterday that the data do not speak for themselves, and you just gave an excellent example of that. You really do have to discuss and interpret the findings, which is what you did. You discussed why these people were so dissatisfied, you interpreted that, and you did something with it. That's a wonderful example, I think. Someone in the back there.

AUDIENCE MEMBER: Good morning. This is Ron from Wisconsin. Think about how many surveys you have gotten in the U.S. mail. One of the things I have a problem with, and I have filled out a lot of surveys, is keeping it as simple as possible and right to the question, as well as the printing; sometimes the questions are hard to read. Communication-wise, a survey should be easy to read, easy to work with, and direct to the point. Like I say, the more detail you put in the survey, the faster you will lose a person right off the get-go. So keep it to very basic questions, what's the problem, and keep it easy. One thing too, I don't know if this was brought up or not, but also, when you send out a survey, include an SASE, a self-addressed stamped envelope, because some people will say the heck with it and put it in file 13.
MICHAEL HENDRICKS: The self-addressed stamped envelope part is right there.

AUDIENCE MEMBER: That's one thing you have to do: make it simple and easy, not just yes-or-no but as easy as possible, because some people look at it and say to heck with it. If it is too detailed, who's going to fill it out?

MICHAEL HENDRICKS: I agree with you. That's an excellent point, and that's why we make this very second point. The first one is send out fewer surveys but work harder to get them back, but the very second one is send a letter in advance to let them know it is coming. I think what I heard you say is that we get bombarded by so many surveys. If we get a letter that says this one is going to be really relevant to me, and they have asked me to help because it will be useful, and you know it is coming, then you can separate it out from the other hundred that come, or whatever.

AUDIENCE MEMBER: I have a question, Mike.

MICHAEL HENDRICKS: We have one over here first, please.

AUDIENCE MEMBER: All right. I will give you a VR perspective for just a minute, because VR does consumer satisfaction surveys as well.

MICHAEL HENDRICKS: Terrific.

AUDIENCE MEMBER: We do them with our state rehabilitation council, and we have been struggling with this for maybe ten years. We tried sending the surveys out sooner, we tried all of the different approaches that we are discussing, we definitely used the stamped self-addressed envelopes, and we finally came to the place where we made phone calls. A quick economic analysis: if you send out 100 surveys, just the postage alone for those 100 with self-addressed stamped envelopes is going to cost you $90 (that's two stamps per survey, one going out and one coming back), not counting the two envelopes, the surveys, and the people's time. And for that $90, I can get somebody to make phone calls and get twice as many responses for the same amount of money. So I think it is more cost-efficient and much more effective just to use phone surveys.

MICHAEL HENDRICKS: Well, I for one am really appreciative to hear from people who have been wrestling with this, actively trying different things, and coming back with some practical advice on what you found works for you. As I said, I was very reluctant to tell you what you ought to do; it is not my place at all. But it is a coincidence, or maybe it is not a coincidence, that the state we surveyed that gets high response rates does telephone interviews, and Missouri, who is also talking with us, does telephone interviews and gets high response rates too, it sounds like. That's also meshing with your experience. Again, I am not saying we should all do telephone surveys. But I am saying maybe we should think about it, based on what you just said too. There's a bunch of hands around. Go ahead.

AUDIENCE MEMBER: This is Sheryl. I think also, for the questions about future services, the how-you-can-help-us-better-the-centers-or-the-service-delivery part, we have to be careful how we ask. I don't necessarily have the correct way to ask the question in my head right now. But if you ask it and people perceive it as a needs assessment, for example: as a person with a lifelong disability, as soon as I hear "needs assessment" I start running the other way, because as people with lifelong disabilities we have been over-assessed and over-everything, folded and whatever. And the needs haven't changed that much. We all know that.
And so I think we have to really be careful how we present that "how you can help us provide better services in the future" part.

MICHAEL HENDRICKS: Okay. That's good feedback. Yeah.

AUDIENCE MEMBER: I want to take off my SILC hat and put on my CIL hat. As a CIL director, I have four different contracts, and all four contracts require a consumer satisfaction survey, and potentially I have consumers who are on all four contracts. So my consumers are like, why am I getting another survey in the mail every three weeks? Didn't I just answer this? You know, can't you freaking figure it out? And that was the literal question I got from a consumer a couple of weeks ago: can't you just freaking figure it out from the first one? What's the deal here?

MICHAEL HENDRICKS: We were actually talking about this at the break, and we came up with a couple of possibilities about this exact thing. One is to try to combine the surveys in some way for the people for whom that's relevant, so they at least get just one survey, even if it is a little longer. But here is another one I have heard of, and I will throw this idea out and you can see if you like it or not. Let's say you were able to make a list of all of the people you're serving who are likely to get three or more surveys, something like that, okay? And let's say it is 500 people; I am just making up a number, obviously. Randomly divide that into four different groups of 125 each. You've got to do it randomly. Then you have little subpools of 125 each. Then instead of sending them all four surveys, send each one just one. So they each get one, one, one. You don't have any bias introduced into it, because everybody has been selected randomly. You have created these little subpools, but everybody is still just getting one survey. You might think of that.

AUDIENCE MEMBER: So it is essentially doing what you were talking about earlier, using a smaller sample to get the same number?

MICHAEL HENDRICKS: A little variation on that, but it is taking your overall pool of people and making it into subpools. May I say, I will just go on a bit: you may then only send out 25 in each one. I am not suggesting that you have to do all 125, and you will pick those 25 randomly too, out of the 125. But it is a way of avoiding somebody up here getting all four of these, okay? So that is something we came up with at the break as a possible solution, or a way around it. Who's next?
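[Editor's note: a minimal sketch, in Python, of the random subpool idea Michael just described. The numbers match his example; the function name and seeds are illustrative only.]

```python
import random

def assign_subpools(consumer_ids, n_surveys, seed=None):
    """Randomly partition consumers into one subpool per survey,
    so each person receives exactly one questionnaire."""
    rng = random.Random(seed)
    ids = list(consumer_ids)
    rng.shuffle(ids)
    # Deal the shuffled list round-robin into n_surveys subpools.
    return {k: ids[k::n_surveys] for k in range(n_surveys)}

# Michael's example: 500 overlapping consumers, four contract surveys.
pools = assign_subpools(range(500), n_surveys=4, seed=42)
for survey, pool in pools.items():
    print(f"survey {survey}: {len(pool)} consumers")  # 125 each

# Per his follow-up, you might then mail to only a random 25 per
# subpool rather than all 125.
sample = random.Random(7).sample(pools[0], k=25)
```

Because the partition is random, each survey still reaches an unbiased cross-section of the overlapping consumers, which is the property the idea relies on.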
MICHAEL HENDRICKS: Let me do a couple more things, and then it is your turn, remember? Your turn after the break; let's get to your turn. So anyway, a bunch of suggestions, not all of them by any means. But the second part: remember, I said you have to have two things. You have got to have data you can believe in, and then you have to have a process to get the data used. I keep saying "used," don't I? Because what good does it do to collect this data if we are not using it? So here are some thoughts about a process to get it used. One is to not do it yourself: involve all IL partners from the very beginning, make it very collaborative. Do all planning collaboratively. And this next point I am going to turn to Bob Michaels to say something about, because he mentioned it to me at the break: you are going to talk about the "gotcha problem," aren't you? I said we are, but if we don't talk about it enough, you talk about it more. So you have to really find a way in your state to avoid the CIL competition, or the "gotcha."

Remember, what I am saying is you need to know who's more satisfied and who's not satisfied so you can do something about it. What if part of that is knowing which CILs are leaving people more satisfied than other CILs? Oh my gosh, what a political minefield that is, right? Because it is not your job to oversee the CILs; we all know that. But you are the state, you know. So, I will take Ohio: if you see that the southern part of the state is less satisfied with services than the northern part of the state, you are not supposed to ignore that, and yet at the same time you are not supposed to oversee the CILs. That is tricky. Let's have Bob give us some thoughts and, hopefully, advice about that.

AUDIENCE MEMBER: You know, I brought this up to Mike at break and said, are you going to talk about the 800-pound gorilla that's here, the one where the centers don't want to be evaluated by the SILC and they're afraid of that? Historically, you need to understand, it was a real environment back in 1992 when they put SILCs in charge like this, and they put this language in the law because they were really concerned that SILCs were going to try to monitor the centers. And even today I hear it every once in a while. Now, we have done pretty good; we have gotten people onto SILCs, and in charge of them, who understand that no, it is not our responsibility to monitor the centers. But on the other hand, you will have some center that isn't doing very well, and somebody will get on the SILC and see this as an opportunity to use the information to compare: well, obviously they're not as good as the other center, you can see they're not. That danger is important to me, but you are right, we have to do something to compare. What do you do? I don't know. You will say, well, we have got to do some sort of a thing. And I will tell you, as a SILC member, it is hard to compare. You can look at each center and say they seem happy, that they're identifying problems, and you look at an individual CIL doing that. But not all CILs do that, and not all CILs are run like that. So I don't know what the answer is. It is almost like you need to have some way to compare, but how do you build in the protection and make sure it is not some dummy on the SILC using it the way they shouldn't?

MICHAEL HENDRICKS: Let me say, if anybody in the room feels that in their state they have found a good way to do CIL-by-CIL comparisons without causing political problems, where people are able to understand why you are doing it and that it is for improvement, please tell us how that is working in your state. Pat, I am not sure that's what you were going to say, but I would like to hear it.

AUDIENCE MEMBER: It was not what I was going to say. But even though the SILC is the repository, according to the regulations, the repository of information for consumer satisfaction, what the regulations actually say is that the state plan shall, blah, blah, blah. And so one of the things I was wondering as you were asking your question, Bob: what would prohibit the state plan from calling for dissemination of the results of satisfaction surveys to the boards of directors of all of the CILs? I mean, if that was the process you had outlined in your state plan, that the DSU sent consumer satisfaction results to the boards of directors of the centers. The DSU has the responsibility to monitor the centers if they have any skin, any money, in the game.
And so why wouldn't that be a reasonable approach? It is not the SILC sending out consumer satisfaction information; it is the monitoring agency, either the DSU or whoever else is doing the monitoring. What about RSA? That's the question I keep wondering about too. What's the role of RSA in monitoring centers, and what do they do with the information they get about consumer satisfaction?

MICHAEL HENDRICKS: Others?

AUDIENCE MEMBER: Do we really want RSA and the DSUs to have that role? Some centers would say that's even worse than having the SILC have it.

AUDIENCE MEMBER: This is Ann from West Virginia. Just to respond to what Bob just said: we may not like it, we may not like the way they monitor, but it is their responsibility to monitor. But back to Mike's question. In West Virginia, we do a collaborative process. Everybody gets the full report, but there's no comparison among centers in the full report, and then each center gets its own response data. The goal there is for the centers themselves to figure out how to make things better, instead of somebody else saying, well, you're not as good as the guys up north, what are you doing wrong, you need to be doing better. It is kind of self-improvement. We send out the surveys, and we have a consultant who receives and compiles and reports. We don't see the individual data; we see the report that she compiles. We don't get the individual data by center; only that center gets that data.

MICHAEL HENDRICKS: That's very interesting, and a lot like what Bob was talking about yesterday afternoon with the CIL outcome measures project we are working on, where we got information from 22 CILs. Everybody got the overall picture, but each CIL, privately and only to them, also saw just how they were doing compared to the other 21. And as Bob said yesterday, they loved that; it was the first time they had gotten anything like that. So maybe that's a model. So maybe what you are saying, Ann, if I am not mistaken, is that it is not the SILC's job to follow through and make sure that a CIL improves, but it is perhaps the SILC's job to make sure that the CIL has the data it needs to improve, if it wants to improve. Is that -- I am just brainstorming. We are all brainstorming here. Interesting, interesting model. Just a show of hands: how many states do something close to that, where if I am a CIL in your state, I get information about how satisfied my people are compared to people from other CILs around the state? No hands went up. Interesting. I am just brainstorming here. There was one here.

AUDIENCE MEMBER: I have to discuss the gotcha thing as it relates to the satisfaction surveys. What we are trying to do in Maryland is set up collaboration by having the IL partners, the CIL directors, the SILC officers, and representatives from the DSU meet consistently, so people feel comfortable talking about the needs they perceive they have. It takes time to set up a comfortable situation where we can share with each other and collaborate, so that when you get down to sharing these consumer satisfaction surveys, it will be a comfortable climate to do so.

MICHAEL HENDRICKS: I like what you are saying there. You are trying, collaboratively, to build a culture of trust and familiarity among each other, and also a culture of improvement. That's why we are here: continuously improve. I like a lot of what you just said. We have one up here. We need a mic.
AUDIENCE MEMBER: We from Missouri are all distracted because our flights are all canceled, so we are trying to figure out what is going on. Our hands should have gone up when we talked about the CIL thing.

MICHAEL HENDRICKS: A little louder, please.

AUDIENCE MEMBER: One thing that's unique: I have never been told before that I need to be louder. Here, I will make you feel good.

MICHAEL HENDRICKS: We want to hear what you say, so will you please speak much louder.

AUDIENCE MEMBER: You heard that, Kathy. One of the things RSA thought was very unique when they reviewed the SILC is what we do, and I think you are talking about us when you talk about the 22. The SILC runs what is called IL Outcomes, and we all do the exact same survey. Chris got it onto Survey Monkey this year, and where you were talking about 20%, 100% of the CILs actually participate. It is all the same information, so we can use it for our reporting as well as for our IL summit that we do biannually. We use that information to figure out what the CILs want additional training on.

MICHAEL HENDRICKS: Okay. Let me ask you a question about that. I think I heard you say that each CIL gets their own private information, yes? Uh-huh. Do other CILs know that at all? If I am in CIL A, will I know how well CIL B has done?

AUDIENCE MEMBER: No, the other CILs cannot see what each CIL does, but we compile all 22 CILs' information together in one report, and then we send out individual reports to each of those CILs so they can see how they did individually on that tool. We have actually used the same tool for about the last six years; we just kind of tweaked it a year and a half ago. And we ask about services and satisfaction with services, and we ask one open-ended question: what change did this make? So there's open-ended information. And we have always done it prior to our legislative session, so we can take it to our legislators as well and use it to educate the legislature.

MICHAEL HENDRICKS: Just my personal opinion, but it feels like there might be learning we can do from this Missouri model. I don't know if you have written that up, but if you are willing to write that up and share it with people on the SILC Congress web site, just a brief one- or two-page thing, it sounds like you have something working quite nicely, and maybe some of the rest of us could learn from it. That's my thought. I'd love it if you did that.

AUDIENCE MEMBER: What helps is we actually have a SILC committee that works on this, which Chris chairs. So we look at it when CILs have complaints about the survey or something is going on. That's where the tweaking came from, and where Survey Monkey came in, and now we kind of do it all year long instead of having just a certain collection time period.

MICHAEL HENDRICKS: So maybe that's part of what you can put in your one- or two-page thing: it is important to have a committee of the SILC that deals with this. Survey Monkey is a free, I think, or at least cheap way to gather this. It is cheap. If you are willing to just put something on paper that the rest of us could learn from, I think, and it is my own opinion, that could be really valuable.

AUDIENCE MEMBER: I just wonder if we can get a copy, a copy of the instrument that they are using.

MICHAEL HENDRICKS: Maybe that also, if they are willing.

AUDIENCE MEMBER: The report is on our web site.

MICHAEL HENDRICKS: And your web site is?

AUDIENCE MEMBER: www.mosilc.org.
MICHAEL HENDRICKS: So it is just the abbreviation for Missouri plus silc.org, and I guess that's standard. That's great. See, isn't it great to come together and learn stuff like this? I think Bob and I are stunned, because your number, 22 CILs, is exactly the same number of CILs around the country that we have in our pilot for field testing, and we did exactly the same thing that you did there. What are the odds of that happening, Bob? Amazing. You are in the new field test. Super. What else? Anybody else? Shall I go on? I do have to put you to work here, don't I? We are doing such good work together, but I need to put you to work.

So, explicitly avoid CIL competition, or gotcha. Don't identify individual CILs in the analysis; we have been hearing here that's an important thing to avoid. Give all findings to everyone, no secrets; that means the overall findings, not the individual findings. And plan. You know, you are going to think this is unnecessary, but I have been in this business a long time: if you don't plan how you are going to use the data, it won't happen. So plan how you are going to use the data, and I like to have three steps. Do we believe it? What does it mean for us? And what are we going to do about it? If you can get people to sit down with any piece of information and ask, do we believe it, what does it mean for us, and what are we going to do about it, you are a long way toward use. And here are some online resources to learn more; they're in your handout. Now, it is your turn. This is what I want you to do at your table. We have thrown out some ideas. We have a question, sorry.

AUDIENCE MEMBER: Can I ask a question about sample size? Is any randomly generated sample, by virtue of being randomly generated, statistically relevant?

MICHAEL HENDRICKS: What do you think my answer is going to be?

AUDIENCE MEMBER: It depends.

MICHAEL HENDRICKS: Yes, it does.

AUDIENCE MEMBER: I was hoping you could add a little bit to the end of that, though.

MICHAEL HENDRICKS: Yeah, I will add a little bit to that. It depends, and it is really complicated. It depends on the kinds of questions you are asking of people. It depends on the kinds of responses you're asking for back. It depends on the variability you're expecting in advance in those answers. It depends on the confidence you want to have in the answers you get back. It is just really complicated stuff. I am not one of these people, but there are people who specialize in that; they specialize in sampling. And there's got to be one in a university near you who could give you some good advice on it, but that's a complicated question. Just so we know what we are talking about here, here is the question: if you have got 1,000 people you have served, is 100 enough to sample? What we said is we will either telephone or send out a questionnaire to 100 of these 1,000, and you are asking, well, is 100 enough? Does it have to be 300, 250? Somewhere in there is a magic number, and it can be calculated depending on the situation, but it gets very complicated, very tricky. You need an expert to advise you, and I am not that expert.
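[Editor's note: for readers who want to follow up on the sample-size exchange, a rough sketch of the textbook margin-of-error arithmetic behind "is 100 out of 1,000 enough?" This formula was not given in the session, and it ignores the complications Michael lists (question type, expected variability, nonresponse), so treat it as a starting point, not expert advice.]

```python
import math

def margin_of_error(n, N, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion, with the
    finite-population correction for sampling n people out of N."""
    fpc = math.sqrt((N - n) / (N - 1))
    return z * math.sqrt(p * (1 - p) / n) * fpc

def required_sample(N, e=0.05, p=0.5, z=1.96):
    """Sample size needed for margin of error e, adjusted for population N."""
    n0 = (z ** 2) * p * (1 - p) / (e ** 2)
    return math.ceil(n0 / (1 + (n0 - 1) / N))

print(f"100 of 1,000: about +/- {margin_of_error(100, 1000):.1%}")  # ~9.3%
print(f"needed for +/-5%: {required_sample(1000)}")                  # ~278
```

So with 100 responses out of 1,000 consumers, a 70% satisfaction figure really means something like 61% to 79%; tightening that to plus or minus five points takes closer to 280 completed responses, consistent with Michael's "somewhere between 100 and 300" framing.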
MICHAEL HENDRICKS: So, your assignment. We have thrown out some ideas, and you have shared even more. This is what I want you to do: at your table, discuss what we have been talking about here, and talk about what you think makes sense to do in your state. I would love for you to come back with three first steps to take in your state, okay? Three first steps for tackling this whole thing of consumer satisfaction. So do that for a bit, and then we will hear what we have come up with. If we don't have practical first steps, this will slide away this afternoon. We need practical first steps to do something here. Anybody at all: if your table developed something that would not be a bad first step, shout it out. I see a hand over here. Okay.

AUDIENCE MEMBER: First of all, I think you have to determine who's going to do it.

MICHAEL HENDRICKS: Who's going to do what?

AUDIENCE MEMBER: The consumer satisfaction survey.

MICHAEL HENDRICKS: Okay. Determine who will measure consumer satisfaction. Okay.

AUDIENCE MEMBER: That would be done collectively, I mean, probably by being brought up at a SILC meeting or something of that nature.

MICHAEL HENDRICKS: Okay. So talk about consumer satisfaction with the rest of the SILC. Okay.

AUDIENCE MEMBER: We have a meeting in two weeks, a SILC meeting in two weeks, and we have a bright shiny new evaluation committee. So the evaluation committee is going to meet; we have our committee meetings the day before the SILC meetings. That's our first step. One of the other people who is here is on the evaluation committee, so we get to talk about what we learned today, this week.

MICHAEL HENDRICKS: Okay. So you are saying the SILC evaluation committee, if there is one, or create one perhaps, should think about consumer satisfaction. And I will say, oh, here is a chance for me to draw. I cannot draw worth anything, so this is going to be the world's worst three-legged stool.

AUDIENCE MEMBER: It looks great.

MICHAEL HENDRICKS: Isn't that the world's worst three-legged stool?

AUDIENCE MEMBER: It is not so bad.

MICHAEL HENDRICKS: Anyway, we are talking about all three legs of the stool. Over here we have one. Others? If you are close enough to shout out, I can repeat it. Okay.

AUDIENCE MEMBER: I would say that the IL partners should sit down at the table collectively and discuss it and decide.

MICHAEL HENDRICKS: Not just the SILC but the IL partners discuss this, okay.

AUDIENCE MEMBER: I am going to bring back samples of the Missouri tool and the West Virginia tool, and there was a third tool.

MICHAEL HENDRICKS: Let's find a way to formalize that so everybody can gather those. So we are all going to maybe skim the Missouri web site. And the Missouri people already offered, completely spontaneously on their own, to write up a little thing about this, right? And where are they going to post that so we can all read it? Where's that going to be? Is it going to be on the web site? It is going to be on the web site, so a one-to-two-pager on the Missouri system, if we can get them to do it. You said also West Virginia.

AUDIENCE MEMBER: West Virginia.

MICHAEL HENDRICKS: Uh-huh.

AUDIENCE MEMBER: Then I am also thinking about calling KU.

MICHAEL HENDRICKS: Kansas University.

AUDIENCE MEMBER: Yes. And then bring that up at the next association meeting with the centers and let them evaluate.

MICHAEL HENDRICKS: Is there some way you can do this wonderful work and put it where other people can have access to it? I am just wondering, maybe on the SILC Congress web site or something, or your state web site?

AUDIENCE MEMBER: Yeah, I have none of that information yet. I am going to collect it, but sure, I would be happy to put it up.

MICHAEL HENDRICKS: Another spontaneous offer to write something up; what a wonderful person. What other first steps when you get back to your state? Here is one. We have one back here; Brad first.

AUDIENCE MEMBER: We will look into using an SPSS database to apply data across questions to determine if there are any trends.

MICHAEL HENDRICKS: Brad has just raised a really interesting point. When you, or whoever in your state is doing it, take this consumer satisfaction data and put it into a computer, presumably, there are different ways you can store it in the computer. Some are much easier to then slice and dice in a way that's useful to you, and you are saying you are going to look at one particular way, which is SPSS, the Statistical Package for the Social Sciences, a very good slice-and-dice tool. There are others too. But you are saying you will pay attention to the database you put this into, to make sure you can use it; is that correct? Okay. Good. I think that's real important.
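[Editor's note: a minimal sketch of the across-questions trend check Brad describes, written in Python with pandas and scipy as a stand-in for the SPSS workflow he names. The file and column names are hypothetical.]

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical layout: one row per respondent, one column per question.
# Storing responses this way is what makes cross-question analysis easy.
df = pd.read_csv("satisfaction_responses.csv")

# Example trend question: do ratings of timeliness move together with
# overall satisfaction ratings?
table = pd.crosstab(df["timeliness_rating"], df["overall_rating"])
print(table)

chi2, pvalue, dof, _ = chi2_contingency(table)
print(f"chi-square p-value: {pvalue:.3f}")  # small value suggests a real association
```

The specific tool matters less than the point Michael draws from Brad's comment: store the data in a form you can slice and dice later.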
MICHAEL HENDRICKS: Over here, I think it is.

AUDIENCE MEMBER: Is it me?

MICHAEL HENDRICKS: It is.

AUDIENCE MEMBER: This is Sheryl. I am from Hawaii, but I am at a table from West Virginia, so I will just speak for the table. One thing we decided we were going to do is look at making sure the survey is available in a variety of formats. For example, we might use a telephone call, we might use the web, and the written format. But also make sure that people can get the surveys in alternate formats instead of always being sent standard print.

MICHAEL HENDRICKS: Okay. So, a variety of formats, as appropriate. That's excellent, okay. Here is another first step, I think from West Virginia.

AUDIENCE MEMBER: Yeah. Can you all hear me? I am Nathan Parker from West Virginia. At our next SILC meeting, which will be in February, there's a lady who does our consumer satisfaction surveys, and she will be at that meeting.

MICHAEL HENDRICKS: So you hire a consultant.

AUDIENCE MEMBER: Yeah, we have a consultant, and her name is Danetta Dowler, from the West Virginia University Center for Excellence in Disabilities. And one thing we are going to do, as Ann mentioned, is see how we can revisit our surveys. I mean, it is like five years old.

MICHAEL HENDRICKS: I think that's good.

AUDIENCE MEMBER: And see what we can do, maybe change some of it, and revisit.

MICHAEL HENDRICKS: The consultant is coming to the SILC meeting; is that correct?

AUDIENCE MEMBER: Yes. Well, she is coming to our committee meetings, and she will meet with the SPIL team too, which is part of it. We have an advocacy committee, an executive committee, an administrative committee, and the action committee. The action committee does the SPIL.

MICHAEL HENDRICKS: Okay. Good. Other first steps for anybody? Okay. Berta, first steps? Well, then, this is a good list of first things to go back and do. So let me wrap up the discussion of consumer satisfaction, if I can, and see if we can summarize it a little bit. You've got to do it. You're telling me it has been a while since you've really thought about how you are doing it. I am looking at some of the stuff and not being 100% happy, and you're not either, I think, when you're honest.
So it sounds like this would be a good time to go back and, as someone said, revisit the issue. Don't forget that the handout you are taking back has specific suggestions in it for you. So for what they're worth, at least consider them. That's all we ask. Over here.

AUDIENCE MEMBER: What about a central web site, or some way, where people who are doing this successfully could put all of this information, and those of us who want to could go on the web and access it? There are a lot of good ideas from that.

MICHAEL HENDRICKS: There are a lot of good ideas. Tell me if I am wrong: is there not a SILC Congress web site?

AUDIENCE MEMBER: There is. Yes, there is.

MICHAEL HENDRICKS: Is it not possible to post information there, or post links to state web sites that are relevant? Is that not possible?

AUDIENCE MEMBER: Yes, it is. I believe that's the goal of the web site as it is developed and moves forward.

MICHAEL HENDRICKS: Doesn't it seem like that would be the logical place, since we are the SILC Congress meeting and this is coming out of that? It is not my place to say.

AUDIENCE MEMBER: That's where all of the material we had ahead of time was placed for people to download.

MICHAEL HENDRICKS: Bob has a thought too. Seems to me that's a logical place for people to go look, but I don't know.

AUDIENCE MEMBER: This doesn't really have anything to do with that, but this is directed to Tim. One of the concerns that we had: when I go back and try to convince the centers that they need to participate in this process, they need to know that RSA is not going to use the center-specific data to harm them, that RSA will not come back later and say, well, we have got a right to that information, or something along that line.

MICHAEL HENDRICKS: We don't have to put Tim on the spot, but if he wants to reply.

AUDIENCE MEMBER: I don't have a specific answer, but I will check on that for you. We have different things we look at, obviously, 704 reports, that type of thing. So we don't necessarily look specifically at individual customer satisfaction surveys unless we actually go into the centers for a center review. So I could check into that a little more for you, but that is my answer.

MICHAEL HENDRICKS: Thanks. I think, in a general sense, Bob is re-raising a really, really important point that we all realize: we have got to do this in a way that everybody is comfortable with, right? In a way everybody is comfortable with.

AUDIENCE MEMBER: This is Bob. I would like to ask Tim: since you do your site visits and you review the satisfaction reports, how would you then use whatever information you gather from those satisfaction reports? To determine what outcomes?

AUDIENCE MEMBER: I can't speak for RSA right now, but when I did reviews in California, we were more concerned with how you collected the data, what you used the data for, and how it was reported. Basically everything Mike is teaching here in the training, we were looking at it from that angle. Obviously, if 90% of your surveys are negative, then we would want to discuss what the issues are with that. As for RSA, I am not too familiar with that, because I haven't really participated in their actual reviews yet; we will be doing that this year. But that's the way we operated in California. That's the way we approached it: looking more at the usage, the collection, and what it is for, that type of thing.
MICHAEL HENDRICKS: Tim's answer makes me very happy, because what I heard him say was that when he was in charge of something, he made sure the data was being used. And that is very heartening. I really like that. So that's great. Okay. Anything else about this? Because what I want to do is take a little wrap-up of the day and a half. Remember what we said: evaluating your SPIL is the next thing on your plate. You have now done the SPIL; now it is time to turn your attention to evaluating it. You've got to do it. We showed you that. You've got to absolutely do it. And you've got to do all three legs of that stool. You know, we have thrown out ideas, and you'll improve them, but you've got to do all three legs of that stool in one way or another. So this information will be posted on the web; you can go back and look at it eventually. I am sure there are lots of people at ILRU and NCIL and other places who will be happy to, you know, help you find ways to get better at this. But it is on your plate, and you have to do the evaluation.