Loren Data's SAM Daily™

FBO DAILY ISSUE OF JANUARY 09, 2003 FBO #0403
MODIFICATION

R -- Employee Attitude Survey

Notice Date
1/7/2003
 
Notice Type
Modification
 
Contracting Office
Defense Contracting Command-Washington (DCC-W), ATTN: Policy and Compliance, 5200 Army Pentagon, Room 1D245, Washington, DC 20310-5200
 
ZIP Code
20310-5200
 
Solicitation Number
DASW01-03-T-0000
 
Response Due
1/7/2003
 
Archive Date
3/8/2003
 
Point of Contact
TaLisa Spottswood, 703-602-3661
 
E-Mail Address
Email your questions to Defense Contracting Command-Washington (DCC-W)
(talisa.spottswood@hqda.army.mil)
 
Small Business Set-Aside
N/A
 
Description
The purpose of this notice is to answer questions submitted in response to the Sources Sought and to notify responders that the Government is revising the statement of work, working on an acquisition plan, and anticipates releasing a solicitation sometime in February.

Question 1. The contract award amount is expected to be $100,000. You also state that the "contractor may be required to travel to Army facilities to conduct the feedback and planning sessions." Does the $100,000 include travel? Are we to assume that travel will be reimbursed according to the Federal Travel Regulation?
The $100,000 does not include travel for the feedback and planning sessions. Those costs will likely be paid for by the requesting organizations. Army may add resources to the existing contract or require the contractor to bill the requesting organization directly. In either case, travel will be reimbursed according to the Federal Travel Regulation.

Question 2. You state, "the contractor may be required to host up to 2 1/2 day sessions". Does this mean 2 sessions, each of which is half a day (4 hours) long? If not, what does this mean? If the contractor hosts the training sessions, does this preclude travel to Army facilities? If so, how should offerors price this work to account for both possibilities? Or, if you anticipate a contractor hosting training sessions as well as traveling to Army facilities, how should this be priced?
Feedback and planning sessions could range from 1/2 day (four hours) to 2 1/2 days (20 hours), based on experience with our previous contract and the needs of the requesting Army organization. If Army feels it is more desirable than having participants travel to the contractor's facilities, we will require the contractor to travel to Army facilities. Contractors should describe all costs in the cost proposal based on both possibilities.

Question 3. In stating the $100,000 ceiling, you mention that the length of the contract will extend a year. What levels of effort do you anticipate over the yearlong timeframe (e.g., a consistent level of effort across the entire timeframe, or peaks and valleys based on specified milestones)?
The level of effort will be greatest during the second, third, and fourth months of the contract, through the end of the data-reporting phase of the contract.

Question 4. The SOW indicates a need to "show change from the FY01 survey results". However, we were told data from the 2001 survey will not be available. Will this require additional hand entry of summary data from the FY01 survey? If so, it would be very time consuming.
At the very least, Army will provide an ASCII database containing FY01 survey raw data.

Question 5. The SOW indicates Task 4 costs should be included in the cost proposal. Are these costs intended to be included in the $100,000 project cost estimate?
No. Task 4 is onsite consultation that should be considered optional, at the request of specific Army organizations, over and above the $100,000 ceiling.

Question 6. Were FY01 data analyzed to improve the quality of the survey instrument?
With respect to the survey items, most of the supplemental, non-core items will differ from the FY01 survey. Some core items also may be reviewed and revised. Concerning the web-based system per se, Army analyzed open-ended comments made by survey participants. We will review this analysis and identify "lessons learned" as we develop the FY03 web survey system.

Question 7. Key driver analysis is not an automated process. It must be done individually for each reporting unit for employees (400) and supervisors (400) and entered into each report as stand-alone data. The $100,000 estimate allows for only $250.00 per report for tasks one and two plus key driver analysis. This is not realistic.
We acknowledge that key driver analysis is not a totally automated process. However, the successful contractor should be able to come up with a creative method of automating as much of the process as possible. For example, because Army requires a fairly large number of key driver reports, the contractor may be able to come up with reasonable automated rules for identifying breakpoints across Army and apply them to individual Army organizations.
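For illustration only (this sketch is not part of the notice or the statement of work): one hypothetical way to automate breakpoint identification is to rank survey items by their correlation with an overall-satisfaction item and apply a single Army-wide gap rule to every reporting unit. All column names and the threshold below are assumptions, not requirements.

```python
# Hypothetical sketch only: one way to automate key-driver breakpoints.
# The "overall_satisfaction" column name and the 0.10 gap threshold are
# illustrative assumptions, not part of the SOW.
import pandas as pd

def rank_key_drivers(responses: pd.DataFrame,
                     outcome: str = "overall_satisfaction") -> pd.Series:
    """Rank survey items by the strength of their correlation with the
    outcome item, strongest driver first."""
    items = [col for col in responses.columns if col != outcome]
    return (responses[items]
            .corrwith(responses[outcome])
            .abs()
            .sort_values(ascending=False))

def breakpoint_index(ranked: pd.Series, gap: float = 0.10) -> int:
    """Apply one Army-wide rule: cut the ranked list at the first drop
    larger than `gap` between consecutive correlations."""
    vals = ranked.to_numpy()
    for i in range(1, len(vals)):
        if vals[i - 1] - vals[i] > gap:
            return i
    return len(vals)

# The same rule would then be applied to each reporting unit's responses:
# ranked = rank_key_drivers(unit_responses)
# key_drivers = ranked.iloc[:breakpoint_index(ranked)]
```

A rule of this kind trades some analyst judgment for consistency across the roughly 800 unit reports.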
Question 8. Using just a web-based survey introduces a bias to the survey, as it excludes the following: a) respondents with no access to the internet; b) those with limited access to the internet; and c) individuals who do not trust computers and will not give any information out on the internet. This bias is particularly crucial when evaluating employee attitudes, as these individuals might perceive their workplace differently from those who are responding. Thus, limiting the survey to be purely web-based introduces this non-respondent bias into the results, making them less credible.
Army analyzed FY01 survey results against the survey population and found them to be reasonably representative. Points concerning trust and access to the web are well taken. In FY01, we asked all Army supervisors, especially supervisors of wage grade employees who have limited access to computers, to ensure all employees had access to the web-based survey. We included a letter from the Secretary of the Army addressing the security issue.

Question 9. The draft SOW does not say a lot about the web survey and whether or not usability testing has been conducted in the past or whether it might be of value this time. However, it could provide some valuable information for DCC-W. For example, if confidentiality was an issue in previous administrations, one could conduct usability testing to determine how much of an issue that is, explore possible solutions, and offer recommendations on this or any other "usability" issues. Hypothetically, would the owner of the attitude survey be interested in some form of usability testing?
Usability testing of web-based surveys is a relatively new area of research. Generally speaking, it refers to the methods (e.g., focus groups, one-on-one interviews, field observations, and surveys) that are used to evaluate usability. The usability of a web-based survey can greatly affect the experience of the system's users and can contribute to a variety of problems, including user confusion, non-response, and confidentiality concerns. This may ultimately affect overall system credibility. Army would be interested in incorporating some form of usability testing within the confines of the present budget ceiling. For example, analysis of FY01's open-ended comments may be used to evaluate usability so that Army could create marketing pieces (perhaps in the form of Frequently Asked Questions) that serve to alleviate some of the potential problems.

Question 10. The draft SOW mentions open-ended comments but does not say that comments will need to be tied to actual survey responses. Is that something that will be necessary in the next administration? If so, perhaps pre-administration focus groups or some other qualitative approach (e.g., one-on-one or telephone interviews) could help determine the level of concern and ways to alleviate that concern. As mentioned above, this could also be explored via usability testing.
The FY01 open-ended comments were not tied to actual survey responses. However, it may be useful to tie comments to certain important Army demographics. This is something that the contractor should explore with Army during Task 1, when the contractor is consulting with Army as we develop the web-based survey system.

Question 11. DMDC has been doing some interesting work with non-responders. A brief non-responder survey may shed light on why some Army civilians choose not to respond and could provide valuable insight into ways to increase future response rates. Is this something that may be of interest to the survey process owner?
Yes, within the specifications of the first task.

Question 12. The draft SOW mentions 2 1/2 days of onsite feedback and action planning sessions and the desire to end those days with a specific plan of action. Would the process owner be open to alternative approaches? For example, the contractor could come out to do feedback and action planning training in one day. After some time (say, 2 weeks), the contractor returns and helps develop the action plan. The 2 weeks in between would allow trainees to go out, gather information from their reports, and bring it back to the second meeting. Alternatively, if budget is an issue, day 2 could be handled via conference call. Are those alternative solutions of interest?
The contractor should list alternative solutions such as the one described above. It is important that the contractor work with the survey process owner and the requesting Army organization to determine what is best.
 
Place of Performance
Address: Defense Contracting Command-Washington (DCC-W), ATTN: Policy and Compliance, 5200 Army Pentagon, Room 1D245, Washington, DC
Zip Code: 20310-5200
Country: US
 
Record
SN00235002-W 20030109/030107213529 (fbodaily.com)
 
Source
FedBizOpps.gov
