Back in the mid-1980s, when I worked at the Software Engineering
Institute, we were supposed to suggest how to improve the software process
for DoD. I didn’t see this firsthand, but some of my colleagues found
places in the 1980s that were still required to create programs on
*punched cards*, because all their QA procedures were geared to how the
cards were punched, how they were collated into the source, and how they
were identified in change logs. These procedures were *unalterable*; it
didn’t matter that Sun was already producing workstations, or that most
of the rest of us had been creating programs with text editors like TECO
and EMACS since the late 1960s; their procedures had been created in the
1950s and could not be changed at all.
A friend and I once worked at a major manufacturer of … located in ….
While we were creating 21st-century software, their build procedure
consisted essentially of concatenating card decks; they refused to use
#include in C. Instead, the build was something like

    cat a.h b.h c.h d.h e.c > program.c

and then compiling program.c. What is truly scary is that they are one of
the leading producers of …, and there are thousands of …s out there whose
software is built using 1950s techniques. My friend’s stated goal was to
“drag their software methodology, kicking and screaming, into the 1960s”.
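
For contrast, here is a minimal sketch of what the same build looks like
when you let the preprocessor do the work. The file names a.h through e.c
are the ones from the example above; the program body and compiler
invocation are invented for illustration:

    /* e.c -- the headers are pulled in by the preprocessor rather
       than spliced by hand into a generated program.c */
    #include "a.h"
    #include "b.h"
    #include "c.h"
    #include "d.h"

    int main(void)
    {
        return 0;   /* the actual program body would go here */
    }

The build step is then just the compiler invocation, e.g. cc e.c -o
program, and there is no generated program.c to drift out of sync with
its sources.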
I spent two years of my life trying to figure out why the most progressive
companies refused to do business with the Department of Defense, and why
companies like TI used decade-old design software, two or more generations
behind their internal tooling, to build for DoD. It was simple: DoD at
that time required that all the software used be placed “in escrow”. Then
they would put the maintenance of the chip, or when required, the
“second source”, out for bid. A competitor could bid low (lower than
cost, even), but as a result they would get copies of all the software
used by the original creator. So major corporations would not place their
family-jewel software in such a situation. When I told DoD, “We’ve
identified your problem with software acquisition and software quality:
your acquisition process”, they said, “But that would require that
Congress pass a new law allowing us to use modern methodology”, to which
my reply was, “Well, we’ve identified the problem; it’s up to you to
solve it. If it’s going to take an Act of Congress, then it would be a
good idea to start that process now.”
It was also the case that, because of their requirements for massive
design documents, including such obsolete concepts as “flowcharts”
(remember those?), the acquisition process was so long that quite often,
by the time the software was delivered, the machine on which it was
supposed to operate was no longer manufactured.
By comparison, something as straightforward as an electronic election
application being forced to live on something as modern as Win2K is
positively progressive!
Microsoft lives in a dream world in which, upon the latest release of
product X, they think the entire world will change overnight to use the
new version, and there is no attempt to make it possible to maintain the
older versions. For that matter, if, five years out, someone actually
decides to move the implementation to the latest version, you can’t even
FIND the documentation for the old calls, so you have no idea what they
did, or what their parameters meant, so you can’t figure out what the new
calls should be. The cost of revalidating massive software systems
(largely because of the massive documentation requirements, established in
the 1950s when programs were a tiny fraction of the size of modern
programs) means that the slightest change is unrealistically expensive.
I know companies that have not moved from Office 2003 because the
retraining cost is far too high. Retraining literally THOUSANDS of
employees to use Word, Excel or PowerPoint effectively with the godawful
ribbon interfaces is completely infeasible. The Microsoft studies seem to
concentrate on how people who have never seen a computer can use the
products; they say nothing about how highly experienced professionals can
make the transition effectively.
Governments work on 15-year acquisition cycles and 20-year maintenance
cycles. In computer years (or is it dog years?) this means that they are
almost always working on equipment that is prehistoric. What is amazing
is that they don’t require buggy whips for government-acquired cars.
I still saw MS-DOS-based products in use when I taught at military bases,
and, even scarier, a desire to keep the interfaces “MS-DOS compatible”,
e.g., prompting for input and reading what was typed in. (I used to teach
GUI programming to these people, and one of the constant questions was
“How do we issue a prompt and read the keyboard response to it?”)
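
For anyone who has never seen it, this is the interaction style they
wanted to keep, sketched here in C (the prompt text and buffer size are
invented). The point is that a GUI program has no equivalent of this
blocking read; input arrives as events, not as the answer to a prompt:

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        char answer[80];

        /* classic MS-DOS-style interaction: print a prompt, then
           block until the user types a line and presses Enter */
        printf("Enter part number: ");
        fflush(stdout);   /* make the prompt appear before we block */
        if (fgets(answer, sizeof answer, stdin) != NULL) {
            answer[strcspn(answer, "\n")] = '\0';  /* strip newline */
            printf("You entered: %s\n", answer);
        }
        return 0;
    }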
If you think about it, the programming standards are essentially
late-1950s “best practice”, and if someone has a new idea, there is
insurmountable inertia to overcome to change anything. (Look at the Ada
specs: the language has horrible misfeatures because some General once
heard from a friend that thus-and-such a construct was unreliable. I was
involved in the Ada evaluation effort, and the specs were a joke; in many
cases they contained mutually incompatible requirements, such as a need
for real-time response and a need for garbage collection.)
joe
> Unfortunately we have “certified” systems running on W2K that our
> customers are not allowed to upgrade at all. I know it sounds strange,
> but US federal law subjects elections-related software to very heavy
> scrutiny, and once it is written it is nearly impossible to change
> without huge expense. It wouldn’t be so bad if the coding standards
> imposed by the federal government didn’t read like something out of a
> 1970s-era WORST practices manual!
>
> Greg
>
> — xxxxx@storagecraft.com wrote:
>
> From: “Maxim S. Shatskih”
> To: “Windows System Software Devs Interest List”
> Subject: Re:[ntdev] Win8 WDK: no XP support is a showstopping barrier to
> adoption.
> Date: Mon, 19 Sep 2011 05:04:32 +0400
>
>> We need to support Win2K
>
> According to what I know (from professional people who did serious
> research), w2k was already dead in the US around 3 years ago. Very few
> customers were running it, and even among those, very few were
> deploying new w2k machines.
>
> But in non-US markets things are different, especially if you consider
> the server SKUs - the next server release after w2k did not come until
> 2003.
>
> –
> Maxim S. Shatskih
> Windows DDK MVP
> xxxxx@storagecraft.com
> http://www.storagecraft.com
>
>
> —
> NTDEV is sponsored by OSR
>
> For our schedule of WDF, WDM, debugging and other seminars visit:
> http://www.osr.com/seminars
>
> To unsubscribe, visit the List Server section of OSR Online at
> http://www.osronline.com/page.cfm?name=ListServer
>