The Wifcon Forums and Blogs

formerfed

Members
  • Content Count

    762
  • Joined

  • Last visited

Community Reputation

0 Neutral

About formerfed

  • Rank
    Contributing Member

Profile Information

  • Gender
    Not Telling

Recent Profile Visitors

12,707 profile views
  1. The 809 Panel clearly recognized this and strove to make improvements in its report.
  2. It really shouldn't. Best to follow ji20874's advice. Just select the important parts, which include performance metrics (if the government hasn't already cast those in stone in the solicitation), incentives/disincentives, etc. Once the government has selected the winner based on what its proposal includes, the only things that matter are the metric standards and what happens for missing or exceeding them. Realistically, though, there are lots of other things that need to be included, but those should be carefully picked and added.
  3. I see lots of LPTA used. It’s an easy out for contract people to use - it puts the burden on program offices to define needs in absolute terms, puts pressure on companies to submit their most favorable prices initially, and takes less time to award. There’s no need for lengthy technical evaluations, there’s minimal analysis and decision making for contract specialists, no need for detailed cost/price analysis, and best of all, no negotiations involved. It’s a paper pushing exercise. As far as past performance, use of pass/fail generally means all offerors pass. In my opinion, effective use of past performance requires diligent research, including talking with end users and customers. Everyone now seems to use whatever comes out of the database. There’s no personal contact to see how well companies performed against similar efforts. When using that type of information, it should be applied and evaluated on a scale basis. I feel too many contracts get awarded using the easiest way out.
  4. G Smith never responded with more information, so we don’t know. LPTA is overused. I have a hard time envisioning why it is used under an IDIQ arrangement, especially for services. At a minimum, past performance should be factored in.
  5. Funny, but I don’t remember much about all the publicity. What I do remember, though, was his push for thorough cost estimating and analysis, and particularly should-cost. It seemed at the time that a should-cost analysis was required to justify every major expenditure.
  6. Civilian and napolik, I apologize for misreading this and for my responses. I’m wrong in what I said pertaining to this. I’m aware of a current agency procurement and assumed we were talking about it. Now I see they aren’t the same. In the one I was thinking of, the agency published and issued an RFI seeking past performance, methodology, and tools used for Agile development. They planned on evaluating responses using those factors and selecting a manageable number of sources to consider further. In essence, they had a process that helped decide which GSA Schedule holders to consider further. They next sent an RFQ for pricing and simultaneously had companies perform a mini two-week scrum-type sprint development exercise, then evaluated the code produced. They plan on selecting a few for BPA award. I don’t believe, the way this is worded, that they are the same. I read more into this than there is and confused it with another. Sorry about my wrong assumption and bad posting.
  7. An excellent, but time-consuming, approach is to go through an offeror’s proposal and select the meaningful parts. When necessary, change or add language so promises and the like are clearly set out. Go through it thoroughly with the contractor and sign the agreement bilaterally.
  8. Civilian, don’t get hung up on that decision because it’s not the same. In that case, the government used a down-selection process once proposals were received. Here we’re looking at a way to decide which GSA Schedule contractors to consider. It’s the agency-established FAR 8.4 ordering scheme.
  9. The situation isn’t unlike what agencies face when buying IT such as cloud services. That’s because companies have mostly unique ways of packaging and pricing their offerings. A common approach is to require submission of unit pricing and then have companies propose against a government-defined scenario over the contract life, using their individual unit prices with the RFP-defined assumptions. This allows companies to stick with their standard ways of doing business while proposing against a common, defined solicitation baseline. The government then carefully analyzes each contractor’s approach against its pricing. This is tricky because the companies may use completely different approaches from each other.
  10. To make this easy, take a look at the exceptions under FAR 5.202. Everything mentioned is covered by this, including GSA Schedules under the 16.505 reference. As I mentioned, Fedconnect is a commercial business and doesn’t have a website. They sell software, and the software links to FedBizOpps.
  11. The NSA site is limited mostly to classified contracts with the intelligence community. Companies have to go through an involved screening process to access the information. It does a good job promoting competition within its classified domain. FAA doesn’t follow the FAR and uses a different process. Fedconnect is run by a private company and is not government. The other two have already been explained. FBO has a specific purpose, and the other sites don’t really fall into FBO’s arena.
  12. napolik, it's important not to mix case law pertaining to FAR 15 with FAR 8.4. I don't see this as a FAR 15 procurement where price must be considered in reducing the number of offerors, as in a competitive range determination. It seems to me the agency is describing its process to consider and select Schedule holders for BPA awards. There's no requirement to consider price until the agency gets to making its best value selection decision. This looks like a process to narrow down the Schedule holders under consideration to a manageable number.
  13. Ninja, good course of action. Protests are lost because of improper evaluations. The CO and others should check everything in the documentation to ensure evaluation consistency and compliance with the RFP statements.
  14. Picking up on the training theme, the more complex acquisitions generally have experienced and knowledgeable PMs/CORs. If they aren’t doing SOWs correctly, it’s usually because they haven’t been shown how to do it properly. That’s where examples and personal assistance go a long way. As I mentioned earlier, agency-wide training is good when you get into the details of critiquing and showing by example what’s good and what isn’t. Now, if you have CORs who can’t catch on and you choose to write the documents yourself, that works as long as you have the time and energy. I’ve found that if a widespread problem like this exists, it’s important to let senior management know. Propose fixes as part of the job. Several agencies have resorted to centralizing SOW-writing functions, where experts do it for a program or larger function. Bringing in contractors as SMEs is another option. But if these contracts are important to the agency, they need proper attention and resources. Having a CS or CO do it doesn’t solve the larger problem.
  15. I also agree with General. If bad SOWs are a widespread problem, get your management to schedule training. A session with lots of critiquing examples goes a long way.