The Wifcon Forums and Blogs - 27 Years Online


Matthew Fleharty

Members

  1. Bingo - there is a difference between wanting to understand various approaches during market research and evaluating approaches (which are usually not promises) during source selection. @Scrutor Had your story occurred during a source selection (not some measly purchase order), and had the offeror written that as its approach, received the contract because the evaluators liked what they read, and then performed without using the ground protection mats (say, because using the mats would have cost more money and a dollar saved is a dollar earned for the company), what then?
  2. I don’t contest that this is the norm - that said, are our R&D projects generally successful (choose your measure(s))? That’s worth thinking about because, if they’re not, well we all know what they say the definition of insanity is…
  3. Thanks to everyone who has contributed to the discussion. I’m working on an article that critiques the common practice of requesting approaches, so I’ll have a fuller response for your thoughts/feedback soon. While I rarely say always or never, I have not seen or encountered a situation in my professional career that would have benefited from evaluating approaches in competitive acquisitions - quite the contrary, I’ve only seen the practice make situations worse. I strongly disagree with this position - if most acquisition “professionals” lack competency, most customers are even more ill-informed about how to properly structure a competitive acquisition’s decision-making process. In almost all cases, they’re insistent on doing something because that’s what they did last time, which certainly doesn’t make it right. Vern once told me (or wrote) that contracting officers don’t have “customers,” we have “clients,” and unlike the adage “the customer is always right,” part of doing professional work is being able to tell a client “no.”
  4. Without details of what? We can evaluate details of the offer and the offeror. Is it necessary or even beneficial to evaluate details of their approach? If so, what's the strongest argument(s) for this assertion?
  5. @Don Mansfield Towards the end of the podcast you state you're looking for a word to describe the FAR Part 15 "competitive" negotiations process - I think the appropriate word is "begging" (e.g. we send evaluation notices asking offerors to improve/revise an aspect of their proposal/offer) which contrasts nicely to what we should be doing, and as Vern notes we do under the A&E process, which is "bargaining."
  6. Thanks for the prompt, Vern - it does make sense to get on the same page first. In this context, "approach" means "information in a proposal describing how a contractor may do something." I'll note that in most cases this information is not an offer/promise. For "understanding," I like the framework of Bloom's Taxonomy, so understanding means "the ability to demonstrate comprehension of something by explaining ideas or concepts."
  7. @formerfed @FrankJon & any others: I’m genuinely curious and want to continue the discussion, but not further derail Vern’s thread so: What’s your best argument(s) for why the government should evaluate “approaches” or “understanding”?
  8. In almost all cases an offeror’s approach is not a promise/offer - assuming the approach is not a promise, do you still believe the government should evaluate approaches?
  9. I would not consider an offeror's approach as an evaluation factor. Why is there such an obsession with evaluating offerors' approaches? As for "risk" in a source selection, how do you measure it? And, if you try to, how is it distinct from what has already been evaluated (the offer and the offeror's capabilities) as to not create double counting? Moreover, the "risk" might not be attributable to the offer/offeror at all - it may simply be the product of the government's own requirements (consider the risk in recapitalizing the country's intercontinental ballistic missiles...) at which point why waste the time.
  10. I should have been clearer - I don't take exception to evaluating past performance or experience (both fit within the definition I provided as attributes of an offeror). I do, however, take exception to evaluating things like "understanding" and "risk," as I think they produce noise. I think both the current and future Part 15s are wrong for stating that. I'm curious what your different response would be.
  11. Here's my concern with applying AI to government acquisitions - it's drawing from what we've already done, it is not creating anything new/novel. And what we've done/been doing over the past two decades isn't exactly great. The result is that AI is going to spit out the same or similar products and we're going to get the same or similar results. Is that what we need right now? To get to mediocrity (or worse) faster? I don't think so, but I'm just speaking for myself.
  12. I disagree as to whether that's proper. When you say "we," if you're talking about the acquisition workforce, you're correct in that, for some absurd reason, many people usually consider aspects that go beyond the offer through insane essay-writing contests. This is NOT a good thing. What does it mean to evaluate their "understanding," particularly when proposals are written by capture teams (or today even by AI) and then, once awarded, tossed over the fence for a separate team to perform? What "value" does that provide? Vern has written extensively critiquing the all-too-common practice of evaluating things/proposals that do not provide value to the government. See the following:
    1. “A Primer on Source Selection Planning: Evaluation Factors and Rating Methods” by Vern Edwards
    2. “Contracting Process Inertia: The Enduring Appeal of The Essay-Writing Contest” by Vern Edwards, Addendum by Ralph C. Nash
    3. “Streamlining Source Selection by Improving the Quality of Evaluation Factors” by Vern Edwards, Addendum by Ralph C. Nash
    Nate Silver wrote in his book The Signal and the Noise: "Information is no longer a scarce commodity; we have more of it than we know what to do with. But relatively little of it is useful. We perceive it selectively, subjectively, and without much self-regard for the distortions that this causes. We think we want information when what we really want is knowledge." (emphasis added) We can do better - let's not regurgitate the poor source selection practices that have built up over time, copy-pasted ad infinitum, to deliver acquisition outcomes that no sane person would consider respectable.
  13. I would define value as: worth assigned to an attribute of an offer or offeror
