FBO DAILY ISSUE OF MARCH 08, 2006 FBO #1563
SOLICITATION NOTICE

A -- Software and Systems Test Track

Notice Date
3/6/2006
 
Notice Type
Solicitation Notice
 
NAICS
541710 — Research and Development in the Physical, Engineering, and Life Sciences
 
Contracting Office
Department of the Air Force, Air Force Materiel Command, AFRL - Rome Research Site, AFRL/Information Directorate, 26 Electronic Parkway, Rome, NY 13441-4514
 
ZIP Code
13441-4514
 
Solicitation Number
Reference-Number-BAA-06-13-IFKA
 
Description
FUNDING OPPORTUNITY NUMBER: BAA #06-13-IFKA. CFDA Number: 12.800.

DATES: It is recommended that white papers be received by the following dates to maximize the possibility of award: FY 06 should be submitted by 30 March 2006; FY 07 by 31 January 2007; FY 08 by 31 October 2007; FY 09 by 31 October 2008; and FY 10 by 31 October 2009. White papers will be accepted until 2:00 p.m. Eastern time on 30 March 2010, but it is less likely that funding will be available in each respective fiscal year after the dates cited. FORMAL PROPOSALS ARE NOT BEING REQUESTED AT THIS TIME. See Section IV of this announcement for further details.

I. FUNDING OPPORTUNITY DESCRIPTION: Background: It is increasingly difficult to create software that deals successfully with increased device and system complexity. Moreover, networked distributed systems create vast increases in the scale and scope of information processing applications, exacerbating the challenges to the system engineers' ability to specify, design, build, verify and test software. This situation is an emerging issue in information technology in general, but the requirements of military systems set them sharply apart from non-military applications in terms of reliability, robustness, security, interoperability and real-time operation. Business, government, and technical endeavors ranging from financial transactions to space missions increasingly require complex software systems to function correctly. The complexity of the software arises from stringent requirements (e.g., for reliability in performance and integrity of the data used), the need to support a range of interactions with the environment in real time, and/or certain structural features. These attributes make software difficult to produce. Major hardware-software failures in defense acquisition programs have occurred both because components that were expected to interoperate properly did not, and because tools did not work as advertised. Interoperability is required for network-centric environments, but in practice it has proven very difficult to achieve. A scalable, flexible, realistic synthetic testing environment is required for stressing tools against key benchmarks, for assessment of the utility of tools by key program offices, and for use as a synthetic environment for testing tools against large systems (tens of millions of SLOC or larger) or systems-of-systems.

Objective: This program will be acquired in two phases. Phase I will be the definition phase and will consist of: defining, developing, and documenting the Concept of Operations (CONOPS), a user-oriented document that describes system characteristics for a proposed system from the users' viewpoint; and defining, developing and documenting the architecture and the fundamental organization of the Systems and Software Test Track as embodied in its components, their relationships to each other and the environment, and the principles governing its design and evolution. Initial concepts along with the final CONOPS and architectures will be presented to representatives from Government, Industry and Academia; therefore, the technical data rights proposed need to be consistent with this requirement. White papers are being solicited now for Phase I only. Phase II will be the environment development and operations phase. Additional information will be issued around the January 2007 timeframe concerning Phase II, and white papers for Phase II will be solicited at that time.
The overall objective of this Systems and Software Test Track BAA program is to provide an open framework environment where an assortment of experimental tools and products may be deployed and allowed to interact in real-time, interactive, evolutionary and interdependent ways, thereby allowing rigorous testing of new technologies, methodologies and theories in support of the Software-Intensive Systems Producibility Initiative. The Systems and Software Test Track will facilitate testing of Software-Intensive Systems Producibility research products and methods, provide an environment for research on DoD embedded systems and software problems, enable universities and industry to leverage technology development, and establish a capability for successful technology transition and transfer. The environment should be open and available for use by developers as well as for independent analysis by the facility operators. This independent analysis allows the facility operators to support major defense acquisition program offices and to analyze the utility of tools. Program offices may bring their unsolved problems to the test track either for help in solving them or to look for needed utility among the tools available. Lastly, this synthetic environment should provide a place where large codes can be tested in the loop, allowing requirements verification prior to production and deployment. This risk reduction affords the ability to verify and validate the functionality of today's complex software-intensive systems while providing a realistic environment for researchers to verify their tools against realistic problems.

The Systems and Software Test Track should also provide a place (possibly virtual and not a single physical location) for experimental verification of Software-Intensive Systems Producibility technologies, given their novelty and the potential complexity of the underlying theories. The experimental platforms should incorporate software technology to instrument, monitor and test large-scale applications. Challenge problems for the open experimental platforms should be made accessible to all the research teams. The experimental platform research should include subtasks to conduct large-scale coordination experiments and to develop methods and tools for evaluating aggregate performance of applications. This environment should provide a full range of collaborative technology challenges, run-time platforms and applications, experiments, evaluations, and demonstrations. A common infrastructure will enable control and data flow between both kinds of application components for a distributed environment. The open experimentation environment will provide the fundamental reference architecture and underpinnings, helping researchers to develop and test their designs and facilitating the transition of promising technologies into production use.

Research Concentration Areas: The goal of the Phase 1 research is to A) define, develop, and document the Concept of Operations (CONOPS), a user-oriented document that describes system characteristics for a proposed system from the users' viewpoint; and B) define, develop and document the architecture and the fundamental organization of the Systems and Software Test Track as embodied in its components, their relationships to each other and the environment, and the principles governing its design and evolution.
The CONOPS document should be written to communicate overall quantitative and qualitative system characteristics to the user, buyer, developer, and other organizational elements. It should describe the system, operational policies, classes of users, interactions among users, and organizational objectives from an integrated systems point of view. Areas to consider when developing the CONOPS include, but are not limited to: intellectual property issues; International Traffic in Arms Regulations (ITAR) and export control issues; tool evaluation; features and methods to promote and monitor practitioner and researcher interaction, as well as to provide academic researchers access to large complex software systems; technology transition/transfer of improved embedded software component integration tools and techniques into military system development programs; and supporting facilities and staffing, location(s), networking, firewall and policy related issues.

The Architecture definition should provide the ability to understand and control those elements of system design that capture the system's utility, cost, and risk. These elements could be the physical components of the system and their relationships, the logical components, or the enduring principles or patterns that create enduring structures. The definition shall provide a rigorous definition of what constitutes the fundamental organization of the Systems and Software Test Track, embodying all information regarding elemental relations, interactions and interfaces. Areas to consider when developing the Architecture include, but are not limited to: the environment, virtual or not, to host, execute and test technology, covering both the hardware environment (run-time platform) and the software environment (run-time platform); interfaces and collaborations; mechanisms to support research and analysis of DoD problems including, but not limited to, software artifacts, benchmarks, executables, source code, design documents, requirements documents, examples, models, fault data, lessons learned, software construction files and tools; data repositories for results, success stories, benchmarks, quantitative and qualitative results, software disaster studies defining problems and basic research areas, and the ability to upload/download artifacts; measurement techniques and software forensics; metrics, including reduced development time, the ease with which domain experts and software engineers can interact, the ease with which different domain experts can specify and design code independently of one another, usability (i.e., the 'naturalness' of the modeling language from which code is generated), the ability of test engineers to modify and tune code in the field, and accurate automated documentation in design; and a mechanism for studying innovative systems and dynamic processes, not static snapshots.

References: Institute of Electrical and Electronics Engineers, IEEE Guide for Information Technology-System Definition-Concept of Operations (CONOPS) Document, IEEE Std 1362-1998, IEEE Computer Society Press, 1998.

Anticipated Schedule: Phase 1 - The Definition Phase will begin with a kickoff workshop and proceed with several months of CONOPS and Architecture development for the Systems and Software Test Track. A review workshop will be held midway through the Phase 1 effort, where the initial concepts can be briefed to representatives from Government, Industry and Academia invited to review the ideas and provide feedback.
The final months should be devoted to completing the initial concepts and incorporating comments from the review workshop. A final review workshop should be held in the final month, where the final CONOPS and Architectures can be presented to representatives from Government, Industry and Academia invited to review the ideas and provide feedback. Phase 1 awardees will have to present the full extent of their work at a mid-term and final workshop, attendance at which will not be limited to only the Government and awardees. Also, the Phase 1 work will provide the basis for the Phase 2 solicitation, and will be used both by the Government for definition of the Phase 2 addendum and as a repository available for Phase 2 offerors' use. Therefore, the technical data rights proposed need to be consistent with these requirements. Phase 2 - Additional information will be provided as a modification to this BAA to initiate Phase 2, which will consist of the Development and Operations phase. This modification is anticipated to be issued in January 2007.

II. AWARD INFORMATION: Total funding for this BAA is approximately $18M. The anticipated funding to be obligated under this BAA is broken out by fiscal year as follows: FY 06 - $1.0M; FY 07 - $1.0M; FY 08 - $6.1M; FY 09 - $4.9M; and FY 10 - $5.0M. Individual awards will not normally exceed 6 months, with dollar amounts ranging between $300K and $400K per year for Phase 1, and will not normally exceed 18 months, with dollar amounts ranging between $700K and $3.0M per year for Phase II. Awards of efforts as a result of this announcement will be in the form of contracts, grants, cooperative agreements, or other transactions, depending upon the nature of the work proposed.

III. ELIGIBILITY INFORMATION: 1. ELIGIBLE APPLICANTS: All potential applicants are eligible. Foreign or foreign-owned offerors are advised that their participation is subject to foreign disclosure review procedures. Foreign or foreign-owned offerors should immediately contact the contracting office focal point, Lori L. Smith, Contracting Officer, telephone (315) 330-1955 or e-mail Lori.Smith@rl.af.mil, for information if they contemplate responding. The e-mail must reference the title and BAA 06-13-IFKA. 2. COST SHARING OR MATCHING: Cost sharing is not a requirement.

IV. APPLICATION AND SUBMISSION INFORMATION: 1. APPLICATION PACKAGE: THIS ANNOUNCEMENT CONSTITUTES THE ONLY SOLICITATION. WE ARE SOLICITING WHITE PAPERS ONLY. DO NOT SUBMIT A FORMAL PROPOSAL AT THIS TIME. Those white papers found to be consistent with the intent of this BAA may be invited to submit a technical and cost proposal. See Section VI of this announcement for further details. For additional information, a copy of the AFRL/Rome Research Site's "Broad Agency Announcement (BAA): A Guide for Industry," Aug 2005, may be accessed at: http://www.if.afrl.af.mil/div/IFK/bp-guide.doc. 2. CONTENT AND FORM OF SUBMISSION: Offerors are required to submit 4 copies of a 3 to 5 page white paper summarizing their proposed approach/solution. The purpose of the white paper is to preclude unwarranted effort on the part of an offeror whose proposed work is not of interest to the Government. The white paper will be formatted as follows: Section A: Title, Period of Performance, Estimated Cost of Task, Name/Address of Company, Technical and Contracting Points of Contact (phone, fax and email); Section B: Task Objective; and Section C: Technical Summary and Proposed Deliverables.
Multiple white papers within the purview of this announcement may be submitted by each offeror. If the offeror wishes to restrict its white papers/proposals, they must be marked with the restrictive language stated in FAR 15.609(a) and (b). All white papers/proposals shall be double spaced with a font no smaller than 12 pitch. In addition, respondents are requested to provide their Commercial and Government Entity (CAGE) number, a fax number, and an e-mail address with their submission. All responses to this announcement must be addressed to the technical POC, as discussed in paragraph five of this section.

3. SUBMISSION DATES AND TIMES: It is recommended that white papers be received by the following dates to maximize the possibility of award: FY 06 should be submitted by 30 March 2006; FY 07 by 31 January 2007; FY 08 by 31 October 2007; FY 09 by 31 October 2008; and FY 10 by 31 October 2009. White papers will be accepted until 2:00 p.m. Eastern time on 30 March 2010, but it is less likely that funding will be available in each respective fiscal year after the dates cited. Submission of white papers will be regulated in accordance with FAR 15.208.

4. FUNDING RESTRICTIONS: The cost of preparing white papers/proposals in response to this announcement is not considered an allowable direct charge to any resulting contract or any other contract, but may be an allowable expense to the normal bid and proposal indirect cost specified in FAR 31.205-18. The incurring of pre-award costs, for ASSISTANCE INSTRUMENTS ONLY, is regulated by the DoD Grant and Agreements Regulations (DODGARS).

5. OTHER SUBMISSION REQUIREMENTS: DO NOT send white papers to the Contracting Officer. All responses to this announcement must be addressed to: Department of the Air Force, Air Force Materiel Command, AFRL/IFTC, 525 Brooks Road, Rome, NY 13441-4505, Attn: Mr. Steven Drager. Respondents are required to provide their Dun & Bradstreet (D&B) Data Universal Numbering System (DUNS) number with their submittal and reference BAA 06-13-IFKA. Electronic submission to Steven.Drager@rl.af.mil will also be accepted (if submitting electronically, you do not need to follow up by sending in 4 hard copies).

V. APPLICATION REVIEW INFORMATION: 1. CRITERIA: The following criteria, which are listed in descending order of importance, will be used to determine whether white papers and proposals submitted are consistent with the intent of this BAA and of interest to the Government: (1) Overall Scientific and Technical Merit -- including the approach for the development of the CONOPS and architecture; (2) Related Experience -- the extent to which the offeror demonstrates relevant technology and domain knowledge; (3) Maturity of Solution -- the extent to which existing capabilities and standards are leveraged and the relative maturity of the proposed technology in terms of reliability and robustness; and (4) Reasonableness and realism of proposed costs and fees (if any). Also, consideration will be given to past and present performance on recent Government contracts, and to the capacity and capability to achieve the objectives of this BAA. No further evaluation criteria will be used in selecting white papers/proposals. Individual white papers/proposals will be evaluated against the evaluation criteria without regard to other white papers and proposals submitted under this BAA. White papers and proposals submitted will be evaluated as they are received.
2. REVIEW AND SELECTION PROCESS: Only Government employees will evaluate the white papers/proposals for selection. The Air Force Research Laboratory's Information Directorate has contracted for various business and staff support services, some of which require contractors to obtain administrative access to proprietary information submitted by other contractors. Administrative access is defined as "handling or having physical control over information for the sole purpose of accomplishing the administrative functions specified in the administrative support contract, which do not require the review, reading, or comprehension of the content of the information on the part of non-technical professionals assigned to accomplish the specified administrative tasks." These contractors have signed general non-disclosure agreements and organizational conflict of interest statements. The required administrative access will be granted to non-technical professionals. Examples of the administrative tasks performed include: a. assembling and organizing information for R&D case files; b. accessing library files for use by government personnel; and c. handling and administration of proposals, contracts, contract funding and queries. Any objection to administrative access must be submitted in writing to the Contracting Officer and shall include a detailed statement of the basis for the objection.

VI. AWARD ADMINISTRATION INFORMATION: 1. AWARD NOTICES: Those white papers found to be consistent with the intent of this BAA may be invited to submit a technical and cost proposal. Notification by email or letter will be sent by the technical POC. Such invitation does not assure that the submitting organization will be awarded a contract. Those white papers not selected to submit a proposal will be notified in the same manner. Prospective offerors are advised that only Contracting Officers are legally authorized to commit the Government. All offerors submitting white papers will be contacted by the technical POC, referenced in Section VII of this announcement. Offerors may email the technical POC for the status of their white paper/proposal no earlier than 45 days after submission.

2. ADMINISTRATIVE AND NATIONAL POLICY REQUIREMENTS: Depending on the work to be performed, the offeror may require a TOP SECRET facility clearance and safeguarding capability; therefore, personnel identified for assignment to a classified effort must be cleared for access to TOP SECRET information at the time of award. In addition, the offeror may be required to have, or have access to, a certified and Government-approved facility to support work under this BAA. Data subject to export control constraints may be involved, and only firms holding certification under the US/Canada Joint Certification Program (JCP) (www.dlis.dla.mil/jcp) are allowed access to such data. A Phase 1 awardee will have to present the full extent of its work at a mid-term and final workshop, attendance at which will not be limited to only the Government and awardees. Also, the Phase 1 work will provide the basis for the Phase 2 solicitation, and will be used both by the Government for definition of the Phase 2 addendum and as a repository available for Phase 2 offerors' use. Therefore, the technical data rights proposed need to be consistent with these requirements.

3. REPORTING: Once a proposal has been selected for award, offerors will be required to submit their reports through one of our web-based reporting systems, known as JIFFY or TFIMS.
Prior to award, the offeror will be notified which reporting system is to be used and will be given complete instructions regarding its use.

VII. AGENCY CONTACTS: Questions of a technical nature shall be directed to the cognizant technical point of contact, as specified below: TPOC Name: Steven Drager, Telephone: (315) 330-2735, Email: Steven.Drager@rl.af.mil. Questions of a contractual/business nature shall be directed to the cognizant contracting officer, as specified below: Lori Smith, Telephone: (315) 330-1955, Email: Lori.Smith@rl.af.mil. The email must reference the solicitation (BAA) number and title of the acquisition. In accordance with AFFARS 5315.90, an Ombudsman has been appointed to hear and facilitate the resolution of concerns from offerors, potential offerors, and others for this acquisition announcement. Before consulting with an ombudsman, interested parties must first address their concerns, issues, disagreements, and/or recommendations to the contracting officer for resolution. AFFARS Clause 5352.201-9101, Ombudsman (Aug 2005), will be incorporated into all contracts awarded under this BAA. The AFRL Ombudsman is as follows: Jeffrey E. Schmidt, Colonel, USAF, Director of Contracting, (937) 255-0432 (voice), (937) 255-5036 (fax). All responsible organizations may submit a white paper, which shall be considered.
 
Record
SN01000695-W 20060308/060306212136 (fbodaily.com)
 
Source
FedBizOpps
