Careers: Interviews
Best-selling Author and Top-ranking A+ Authority...
This week, Stephen
Ibaraki, I.S.P., has an exclusive interview with the internationally
regarded A+ certification expert, Craig Landes.
Craig Landes has been
working in the information systems industry for more than twenty
years, following a career in the entertainment world. Starting in
healthcare information systems, he designed database management
systems, training seminars, and systems integration solutions. He
then left the healthcare industry to become a consultant, providing
liaison services between corporate executives and systems
developers. Over the years, he continued to pursue a lifelong
interest in creative and technical writing, publishing many articles
and columns for various computer magazines and newsletters.
In 1998, Craig and James G. Jones wrote their first concise review, or "cram"
book, designed to help candidates pass the CompTIA A+ certification
exam. The exam, divided into a hardware module and an operating system module, is a comprehensive knowledge assessment for first-tier tech
support personnel in the PC and Windows arena of information
technology. The Exam Cram book gathered everything about PCs and
operating systems into a single source.
Over the ensuing years,
"A+ Exam Cram" has become a best-selling review book, with over
150,000 copies sold. In the fourth quarter of 2003, CompTIA released
a periodic revision to the exam. The subsequent revision of the “A+
Exam Cram 2” title is receiving considerable attention—continuing
the “best-selling” tradition of the previous editions. The book
covers everything having to do with PC hardware and Windows
operating systems, from DOS through Windows XP.
Craig continues to write,
presently developing an innovative new approach to teaching MS
Office applications. His upcoming book teaches the practical
application of spreadsheets, with a focus on Microsoft Excel. Future
projects include similar books for MS Word and PowerPoint, a novel, and a self-help guide to life for teenagers.
Discussion:
Q: Craig, you are a
well-respected expert in information systems and A+ certification.
Thank you for taking the time out of your demanding schedule to
speak with us.
A: Thank you for your
interest in the book.
Q: Describe your days in
the entertainment field. How has it helped you in your career in
computing?
A: Well, I started out—I
guess like most kids—playing in bands for fun, and to pick up girls.
I’m a keyboard player, and back in the 60s, electric organs were
just starting to become popular. Keyboard players often end up being the de facto musical director, so it wasn't all that long before I started putting together and running my own
bands. I remember the first time I had to fire someone, and I went
through all the psychological anxieties you see in movies and
stories. But I came to understand many of the management principles
of any business, through running those bands. I remember my father
once telling me that either I could be everyone’s friend, or I could
manage the band. I never forgot that. It taught me the distinction between being a professional and being just someone who plays music every so often.
The second thing I
learned was based on the immediacy of the entertainment world. In
corporate business, whatever anyone does tends, for the most part,
to take a fairly long time before there are consequences. Playing in
front of an audience has immediate consequences. If the audience
doesn't like what you’re doing, they leave. That taught me a whole
lot about the marketplace, products, advertising, and so on. Not
only that, but working with musicians is very different from working
with traditional business employees. When you fire someone in the
regular workplace, there's a level of restraint. But when you fire a
musician, nobody knows what sort of reaction might take place.
Finally, music works on
many levels. For instance, there's a melody line and at least three
or four accompaniment lines. Each of those "threads" is managed by a
completely independent person and mind. Underneath that, there are
also things like the rhythm track, the time signature (like a
motherboard clock), and the complexities of other aspects of music
theory. Everything in a song has to come together perfectly. You
can't have the bass player end the song at a different time than the
guitar player, for instance. To manage the multiple lines and
threads of even a simple blues song is a lot more complex than
making sure a report ends up on a manager's desk by closing time on
a given day. Eventually I arrived at a sort of rule of thumb: managing one musician is about as complex as managing four traditional employees in the workplace.
Essentially, playing a
synthesizer or playing a computer amounts to the same thing. They're
both electronic gadgets, but one puts out music and the other puts
out words and pictures. Either way, the audience (or reader)
responds to the output in many dimensions. It not only has to be
recognizable, but must also be pleasing to the mind (or the ear).
Then it has to be something someone can respond to on an emotional
level, and also something someone can imagine, or visualize. I think
without having been on stage, I'd never have really understood the
interaction between someone creating something and the people who respond to that creation. And I've never forgotten that a computer is only a machine—like a synthesizer, organ, or piano—and that it isn't the technology that matters. It's what someone can do with that technology.
Q: What factors triggered
your interest in computing?
A: Mostly, it was my
being disheartened with the music business. CDs had come out, and
live music was increasingly difficult to play for an audience
accustomed to the perfection of the final, recorded product. I was
tired of being in a business where fewer people had the basic
education to understand the difference in quality between various
types of music. I wanted to be someplace where "good and bad,"
"working and broken," were self-evident. I hadn't really thought
about computers, but I needed a job, so I got into the office
temping business. Obviously, I had to learn something about
computers right away. It turned out I had an immediate and intuitive
understanding of DOS. It was never a mystery to me, probably because
of my history with synthesizers, and I just understood DOS right
from the beginning. It took me a while to understand that most people
don't understand operating systems that easily.
Q: Profile three key
projects from your days in healthcare information systems, database
management systems, and designing system integration solutions. What
lessons can you pass on to our audience?
A: I think the first
thing to understand is that I didn't have what people would call a
formal education in computer technology. As I've said, understanding
DOS was, for me, intuitive. As a result, I was able to pick up other
software very much like "playing by ear." I quickly became very
proficient at whichever applications had the biggest market share: MultiMate, WordPerfect, Quattro, or whatever. When I
started at a local hospital, it was to help with a crisis in
marketing. A competing hospital was trying to take over an area of
the county.
One of the biggest
problems in that crisis had to do with the hospital's minicomputer.
I think they were using a VAX (Digital Equipment Corp. computer
system). My boss couldn't get any useful information from the IT
department, and was going crazy with that. Secondly, they were using
an expensive industry-specific database system, designed to quickly
manipulate marketing statistics. However, my boss couldn't get the
information from the hospital Accounting department. I learned how
that database worked, in general, and saw the problem.
At the time, "PC File" was one of the first shareware applications to come out. There's a lot of
history there, but the bottom line is that it was a $30 database
that was very easy to use. All I really needed was either a
comma-delimited, or fixed-space ASCII file. I could get a "print
report" out of the VAX, so I asked the IT department to run a query
on their master database, for the fields my boss wanted. Then I
found another very elegant program ("Monarch") that specialized in
taking a print file (PRN file) and parsing it into individual
fields, in a dBase format (.DBF files).
When I'd downloaded the
print file, then parsed it, I was able to import it into PC File and
have a useful database. I told my boss about it, and she wondered if
it'd be possible to get a report for the entire patient history,
based on ZIP codes. Twenty minutes later, I'd produced that report,
sorted on ZIP codes. She was stunned, given that nobody had been
able to do anything like that. I was then able to import the same
dBase file into her expensive application, and from there, we were
in business. Competition, after all, mostly rests on information and
rapid access to that information, and the entire healthcare industry
was mostly crippled with very expensive, too complex, unwieldy,
proprietary systems and applications.
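The workflow Craig describes (dump a fixed-width "print report" out of the VAX, slice each line into fields, then sort or import the result) can be sketched in a few lines. The following is a hypothetical reconstruction in modern Python, not the Monarch or PC File toolchain itself; the field names, column offsets, and sample report lines are invented for illustration.

```python
# Sketch of parsing a fixed-width print report (PRN file) into records.
# Each report line has fields at known column offsets, so parsing is just
# string slicing. Layout and data below are made up for the example.

def parse_print_report(lines, fields):
    """Parse fixed-width report lines into dicts.

    fields: list of (name, start, end) column offsets (0-based, end-exclusive).
    """
    records = []
    for line in lines:
        if not line.strip():
            continue  # skip blank lines between report pages
        record = {name: line[start:end].strip() for name, start, end in fields}
        records.append(record)
    return records

# Hypothetical layout: patient ID in columns 0-8, name in 8-28, ZIP in 28-33.
layout = [("patient_id", 0, 8), ("name", 8, 28), ("zip", 28, 33)]
report = [
    "10004321Smith, Mary         60614",
    "10004322Jones, Robert       60610",
]
rows = parse_print_report(report, layout)
rows.sort(key=lambda r: r["zip"])  # the ZIP-code sort that impressed the boss
```

Once the lines are split into named fields, exporting to a delimited format any database can import (the role dBase .DBF files played in the story) is a one-liner with the standard `csv` module.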
My boss was very
concerned about how much it would cost for us to buy the database I
was using. When I told her it was thirty dollars, she about fell
into a coma. Fortunately, we had emergency medical attention readily
at hand. When she recovered, she decided whatever it was I was
doing, was magic, and pretty much gave me a free hand.
I did the same thing with
Symantec's "Q&A," another DOS-based database that, to this day, remains one of the most sophisticated PC databases ever invented.
(Q&A eventually fell out of mainstream use, but it retains a very large international user community. A 32-bit Windows replacement was
finally developed and released by a company called Lantica,
www.lantica.com.) Anyway, I'd worked with some consultants I knew,
to completely restructure the physician credentialing system. That's
where a hospital periodically reviews any doctor's credentials so as
to grant them privileges to practice medicine in that hospital.
Prior to the Q&A system, it was taking a 5-person department about
two months to run the annual credential process.
I developed a proposal
for a new system, based on Q&A, which would likely cut the entire
process down to about a day and a half. When I met with the IT, Marketing, and Physician Credentialing departments,
they were very interested and wondered how much it would all cost.
Understand that a typical healthcare information system often starts
at around $100,000. The hospital was considering an upgrade to their
main system, which would cost somewhere in the $1-million range.
The consultants had made
their cost analysis, and it would be around $8,000. That included
all licensed copies of the $200 database, installing a small
Ethernet network in the department, and the labor to build the
database. There was a very long silence as all the head honchos
pondered a number that amounted to one hour's work in the surgical
suites. Finally, although the IT people were deeply skeptical, the
CEO and my boss decided to give it the go-ahead. A month later, the
Q&A system was in place—on time, on budget, with several included
change-requests. It worked perfectly. That month, the credentialing
process happened so fast that the department had a new problem:
trying to figure out what to do with the now "extra" two months
worth of time.
I was considered a master
magician, and soon was given the job of heading up the information
systems for an innovative new healthcare project—a subsidiary
company of the hospital. Because healthcare technology is at least
ten years behind the rest of the world, I mostly just went to the
library, whenever I encountered a "new" problem, and looked up old
copies of PC Magazine to see how the rest of the world had long-ago
solved that same problem. Very quickly, the subsidiary company,
charged with taking on management of the global hospital financials,
began doing just that.
I had installed
Artisoft's LANtastic, as the main network, and it never went down;
never had problems; never interfered with business; and had practically no network management issues. Our subsidiary was like the British navy in opposition to the Spanish Armada. The main hospital
IT department (the Spanish, in the analogy) was so big and slow,
they couldn't manage even a fraction of their data. We, on the other
hand, using PCs and PC-based applications, routinely out-performed
anything the hospital departments could do, with more accuracy,
faster turnaround time, and new types of organized information
nobody in the healthcare industry was using or able to generate.
Politics eventually shut
down the entire subsidiary. The Finance department, IT people, and
even the CEO couldn't afford to have a small subsidiary group
demonstrate every few days how incompetent they were. Rather
than change the overall philosophy of information management, they
chose to shut down the company and sweep the results under the
carpet. It's one of the reasons healthcare in America is so
expensive—obsolete technology, politics, ignorance, and the
incapability of most healthcare administrators to understand modern
technology.
Q: You have a varied
background with many successes. Can you provide two stories with a
humorous slant?
A: Well, I eventually
left the hospital, having built yet another very small, very fast,
and very inexpensive Q&A database [for] some of my people. I went
into consulting for one of the third-party insurance payers. About
six months later, I got a call from a friend of mine at the
hospital, asking if I'd come in to talk about a possible consulting
project. They were still using my "simple" database, and nobody
could figure out how to reproduce it. So I put on a suit, and went
over to talk with my friend. I couldn't have been in the building
more than about half an hour before our meeting was interrupted with
a phone call.
It turned out to be a
call from Security, telling my friend—now the head of Marketing—that
I was to be escorted out of the building. I had visions of
announcements going throughout the public address system, "Craig
Landes has entered the premises. Danger, danger! Please evacuate the
building! Craig Landes is on the premises!"
My friend was very
embarrassed, but couldn't do much about it. It turned out that the
head of IT had found out I was having this meeting. The head of the
department was still furious about how, within a year, our little
subsidiary had almost taken over complete management of the entire
hospital corporate structure. That just wouldn't do, and there was
no way they'd allow me any further access at all to the business.
After all, careers were on the line. I laughed, shook hands with my
friend, and left. If you want to know why your healthcare costs
continue to skyrocket, first consider Federal regulations and lack
of competition. But secondly, understand the unbelievable
bureaucracy and egotism of the entire administrative side of the
business.
The other thing I think has a lot of humor potential has to do with my working for Arthur
Andersen. Remember them? They used to be one of the Big Six
accounting firms in the world. Anyway, I was hired in as a temp,
working in their publishing division to put together the many
training manuals about things like mergers and acquisitions, legal
and tax issues, and so on. Everyone was always having a crisis
because there was never enough time to get everything done by some
deadline.
I took time, up front, to
begin developing a couple of macros in Word, along with some
customization of the toolbars in all the MS Office products. Then I
started flying through my parts in the projects. Very soon, I was
getting about five times as much work done as any of the full-time
employees. I reconfigured Windows, added in some freeware products,
and generally fine-tuned my computer so as to get the work done. At
one point, when I had some down time, I even changed the Windows
"Start" button to say "Craig." None of this impacted their network at all; it was all simple configuration changes to the local PC.
I never thought about
politics or personalities, mostly assuming that it was a good thing
for such a large corporation to get as much work done in the least
amount of time possible. Sure enough, I was soon moved out of the
company, for showing up all the high-priced Andersen MBAs. A week or
so later, the IT department called, and asked that I come in for an
hour, to "put my computer back to the way it was supposed to be."
Apparently, nobody could figure out how I'd changed the "Start"
button. As a result, they started telling the division managers that
I'd "reprogrammed" Microsoft Word in some way, and that's why it
looked as if I was able to do that much more work than any of the
full-time employees. They wanted me to "un-program" the Office
products.
If it weren't so
pathetic, I would've laughed. The problem is that all I ever did was
read the reference manuals, configure each application to an optimal setting for whatever work was required, and use completely
standard, built-in features and capabilities. However, I learned
that the vast majority of IT people don't have time to learn any
applications; they're too busy installing hardware. I also learned
that most employees have no idea how to use any of their
applications; they're too busy handling make-believe crises. Large
corporations have probably reached the end of the line, and small
businesses with a Web presence and e-commerce will probably wipe
them out in the not-too-distant future. All because of technology,
and the fact that small businesses don't use committees for
anything.
Q: Can you provide
additional details about A+ certification?
A: A+ certification is
really a great idea. A computer, after all, has become probably one
of the most powerful and important tools for doing business since
the credit card was invented. The problem is that lots of young
people grow up with whatever current technology and software happens
to be available at home. Most of today's technicians developed an
interest in computers through playing games, and wanting their
machine to be faster, hotter, and more powerful than their friends'
machines. They don't really understand the underlying technology,
but they know how to tinker with their own machine to boost its
performance.
It's not all that
different from how kids used to mess with old cars, back in the
1950s. They'd use all sorts of risky, and even illegal, add-ons to
build a hot rod and do street racing. That's all well and good for
your own machine, but it isn't at all a good idea when you then move
on to be a technician for a large business. When you're working as a mechanic, to follow the analogy, you're dealing with family cars that
carry husbands and wives, and children. Although you might
personally know how to make the car a whole lot faster, what's the
point if someone doesn't know how to drive it? That, and people can
get killed.
To follow the analogy
even farther, it doesn't help to make a single machine a super-duper
machine, when the whole point of having the computer in a
multibillion-dollar business, is to keep accurate business records.
There has to be a way to first discover those technicians who know
how to hot-wire a computer, but don't know how to keep it safe. Then
there must also be a way to teach those extremely talented
technicians how to apply their skills in a more conservative
fashion. A+ certification primarily brings the upcoming "Top Guns" of computer technology back down to earth, so to speak.
Secondly, the
certification helps produce a set of standards in terms of what a PC
technician ought to know. Although it's true that DOS isn't the
basis for Windows XP, so what? There are still a tremendous number
of legacy applications in the world, all based on DOS, 16-bit
Windows, Windows 9x, and older versions of databases and
spreadsheets. Those applications are absolutely critical to the
banks, corporations, industries, schools, or other institutions that
use them. A+ certification goes a long way toward making sure that a new
technician won't accidentally wreck a system that's controlling
hundreds of thousands (if not millions) of dollars. Certification
helps put some guidelines in place, and puts us all on the same
page, so to speak.
Q: How will the A+
certification evolve in the future?
A: Unfortunately, I think
A+ certification may be coming to a limit point. It used to be that
a technician could visualize a computer at the other end of a phone
conversation. Windows 9x changed that, introducing so much
customization that each PC has become almost unique, unless it's
just an out-of-the-box machine. Windows XP, wide-area networking,
security, and globalization of the economy have made PC management
far more sophisticated than it was in the past. Again, with the car
analogy, it used to be that some kid could work on a car in his or
her own garage. Nowadays, with all the microprocessors built into a
typical car, that's almost impossible.
A+ certification begins
with the assumption that there's a "standard" kind of PC, and a
"basic" operating system. That's not really true anymore. It's one
of the reasons our book went from 450 pages, originally, to 800
pages today. I don't know how much longer a group like CompTIA will
be able to put together a test of some kind that will adequately
cover the concept of "basic" knowledge about PCs, operating systems,
and simple networking.
Q: There are so many
books available on A+ certification. How do you differentiate your
book from others in the market?
A: Jim and I looked at
many of the books on the market. First, you should know that CompTIA
publishes what they call "objectives" for any given exam. Those
objectives are the specifics of what the candidate can be tested on.
Most of the books about A+ certification take a list of all the
objectives, then go down that list and produce a very short summary
of what they mean. In other words, most A+ preparation books make an
assumption that the reader is highly experienced with PCs and
Windows, and only needs to know how to answer whatever questions the
author(s) thinks might be on the exam.
When we first thought
about our own book, we realized that nobody is discussing the
conceptual foundation of computer technology. I think it's because
nobody really has a grasp of those concepts. It's true that
individual people know a whole lot about individual components of a
system, but there isn't anyone putting it all together in one place.
For instance, we all "know" that a computer can make a True/False
decision. And we know it happens using electricity and transistors.
But how does a computer know that 2 is greater than 1, or that 23 is
less than 85?
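That question does have a concrete gate-level answer: comparison is subtraction. The toy sketch below (my own illustration, not something from the book) compares two unsigned binary numbers using only single-bit AND, OR, XOR, and NOT operations, the way a ripple-borrow chain of full-subtractor circuits would; the final borrow bit reveals which number is larger.

```python
# Toy illustration: a computer decides "a > b" by subtracting b from a one
# bit at a time, as a chain of full-subtractor circuits would, then checking
# the final borrow bit. Only single-bit logic operations are applied.

def greater_than(a_bits, b_bits):
    """Compare two unsigned numbers given as equal-length bit lists, MSB first."""
    borrow = 0
    equal = 1
    # Walk from the least-significant bit, computing a - b bitwise.
    for a, b in zip(reversed(a_bits), reversed(b_bits)):
        diff = a ^ b ^ borrow                                   # difference bit
        borrow = ((~a & b) | (~a & borrow) | (b & borrow)) & 1  # borrow-out
        equal &= ~diff & 1                                      # all bits equal so far?
    # No final borrow means a >= b; excluding equality gives strict a > b.
    return (~borrow & 1) & (~equal & 1)

# 23 is 0010111 in seven bits; 85 is 1010101.
print(greater_than([1,0,1,0,1,0,1], [0,0,1,0,1,1,1]))  # 85 > 23 -> 1
print(greater_than([0,0,1,0,1,1,1], [1,0,1,0,1,0,1]))  # 23 > 85 -> 0
```

The same borrow-out logic, replicated across 32 or 64 bits, is essentially what sits behind a CPU's compare instruction.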
I've had a life-long
interest in both philosophy and psychology, along with a fascination
with biology and medicine. At one point I'd thought of getting into
either the law, or neurology, and although I decided against it,
I've kept a strong interest in how the mind works. The philosophy has to do with how our mind understands information and reality.
By the time I got around to writing a book, I'd developed a
complicated and integrated system of understanding how knowledge
happens, how we store information, and how we retrieve that
information. I saw that none of the books on the market teach technology using any of those principles.
What Jim and I decided to
do was to take all of Jim's technical background and history, and
combine it with my view of how people learn. Then we took both of
our backgrounds in how tests work, and started writing a book
designed to do two things. The first was to build a visual image of
how a pulse of electricity forms somewhere in a system and
eventually becomes useful data. The second was to bring to light the
psychological principles involved in handling questions on an exam
of any kind.
One of the things I
really like about this latest revision is that Que gave us enough
room that we could include a lot more of the psychological reasoning
a person can use to figure out a correct response—even if they can't
quite remember the underlying information. None of the books I've
seen do this. They all expect the reader to rely on rote memory: to
remember a whole pile of facts. We wanted to produce a book that
readers could hang onto, after the exam, as a consolidated reference
to "how things work."
I guess what I'm saying is that although the ears hear and the eyes read words, our brain
stores information in pictures. It takes about seven days for new
information to enter our long-term memory. The reason for the delay
is that our brain has to convert sensory information into a picture,
or image of some kind. Then that image has to add in the additional
dimensions of remembered sound, smell, taste, and the associated
emotions. Although rote memory might get someone through an exam,
it'll only happen if they have a very good short-term memory, to
store a lot of words.
Our book, to the best of
our ability, anyway, is based on the idea of building images. We try
to create an integrated image, or concept of hardware, operating
systems, and the various CompTIA objectives, so that the reader
doesn't have to use short-term memory to pass the exam. We try to
stimulate the long-term memory instead, which is a whole lot easier
to use in terms of information retrieval. Plus, the information we
explain stays with the reader long after they've passed the exam.
Q: Please share ten tips
about A+ certification from your book.
A: 1) Above all, read
each question carefully! Many of them use convoluted language to try
and mess you up.
2) Secondly, read each
question again, backwards, if possible.
3) Third; take a look at
the possible responses, to get a sense of what you're being asked.
4) Finally, read each
question again.
5) Don't worry about all
the unbelievably complicated numbers and speed ratings. In very few
questions will you be asked the throughput rating difference between
SCSI and Ultra SCSI. However, the widely known ratings are important: the three USB throughput speeds, for instance.
6) Don't get lost in the
real world, and what's going on in the latest, greatest computer.
CompTIA is a few years behind the technology curve.
7) Don't over-analyze the
questions. Always remember that the exam is looking for standardized
knowledge—what "everyone" ought to know.
8) Plenty of books get
into the massive body of knowledge about hardware. But don't forget
that the operating system module is half the exam.
9) Learn DOS! At least
the most basic concepts. The Windows XP Recovery Console is
basically the same as DOS, with most of the same commands.
10) The Exam Cram series
is filled with "Exam Alerts," which are as close as we can legally
go to telling you specific questions that are definitely going to be
on the exam.
Q: Describe how hardware,
operating systems, and software will evolve over the next five
years.
A: I had a problem
understanding why CPUs were becoming as powerful as they are. Then I
read some white papers from Intel, talking about how one objective
for the business community is real-time language translation. In
other words, we want a system where someone in New York can pick up
the phone to Hong Kong, and speak in English. The "phone" will
automatically translate to Chinese, and an electronic voice will
produce the conversation at the other end of the connection in real
time, with intonation and nuances.
Computers are too hard.
They're ridiculous when you compare them to MIDI and the music
industry. Long ago, the music industry realized that hardware is
inconsequential to the musicians. Nobody…nobody at all…cares what
goes into a synthesizer. All that matters is the music, and making
cool sounds. Whatever someone has to learn, they want to port that
knowledge over to newer and more expensive instruments. As a result,
I can spend a long time learning one synthesizer, then use the MIDI
interface to play almost any other synthesizer on the market. I'm
not interested in how the synthesizer works; I just want to play the
music I hear in my head.
Computers have gone in
the opposite direction. The researchers, inventors, manufacturers,
and software developers have gotten totally lost in the beauty of
machinery. Nobody cares. That's why we've reached a saturation point
in computer sales. Nobody has any interest at all in how their car
works. They just want to get in and drive somewhere. As long as it
starts and moves, it works. Until technical developers understand
that simple concept, computers and applications will be frozen in
the limbo of "too much technology."
PCs have become commodity
items. Nobody really wants to, or cares about repairing a computer.
If it really breaks, they just buy a new one. Electronic information isn't all that easy to work with, since it requires an electrical power source. That's why all of us have pieces of paper all over
the place. Paper stores information without having to use a battery
or a wall socket. It's a lot easier for me to look in a paper
notebook for a phone number, than it is to go to the bother of
turning on the computer.
That's another one of my
rants, by the way. I had a small PDA that used AA batteries. If the batteries died, I could go to just about any 24-hour drugstore, anywhere on the planet, and get replacement batteries. Now, for some
unknown reason, PDAs tend to only use rechargeable batteries. If my
unit runs down, I now have to have a charging cradle, cables, and a
wall socket. What if I'm out camping? Or on a boat? What conceivable
reason is there to only use rechargeable batteries for the power
source? I'll bet it has to do with how cool the technology seems,
and marketing costs. But the bottom line is I won't rely on a PDA
anymore. I go too many places where a wall socket isn't convenient.
So there goes one lost sale. Multiply that by all the other people
in America who think about the real world, and you'll see why the
PDA industry sort of "dried up" for "no apparent reason."
In spite of Microsoft's
belief that "everyone" will have an always-on Internet connection,
they're lost in a technological delusion. Perhaps with "instant-on"
power systems and wireless connectivity, that might happen some day,
but not before then. I think the next "big thing" we're waiting for
is a robotic interface with machines, like science fiction has
understood for years. Voice recognition and visual pattern
recognition are the two main things holding up computer technology.
Of course that has nothing to do with A+ certification, but again,
nobody really wants to learn how to repair non-standard machines.
It'd be like each car in every garage being configured and built in
a completely different way.
I think Microsoft has
come to the end of the line. Their need to continually increase
profits has led to schemes for charging people money for their own
information. Longhorn (the next XP) looks like it'll be so tied to
the Web that people won't even store their files on their own
machines. Instead, they'll store "shortcuts" or links to those
files, and keep the actual files on a Microsoft site, where they'll
lease space on a server by the month. IBM tried that with hardware,
and it failed. I think Linux or some other open system is the wave
of the future. "Give away the razors, but charge people for the
replacement blades." That's an old marketing strategy, and Linux
tends to follow that principle.
An operating system is a
one-time purchase. So give it away. Yes, it takes creative effort
and personal ingenuity to develop that operating system, but that's
not where the money is. In the same way, writing a song is a
personal event. However, I make money on the song through the
distribution process, and with live concerts. Linux is "out there."
But making it easy to use is a distribution event. So are the
applications and interfaces. Either way, today is all about the
applications and operating systems, and they're not where the money
is.
Microsoft is right,
trying to make money from the information itself. But nobody's going
to go for their scheme of renting storage space on an Internet
server. Instead, I think the next version(s) of XP will probably be
the end of the line for Microsoft. Already too many people are fed
up with the "activation" feature, which locks software to a single
box. Symantec and Intuit think that's the
future, and I think they'll go bust with it. Again, the MIDI
interface and music industry understand that open architecture and
portability are the driving forces, not proprietary, frozen, static
systems.
What if you could have
all the energy, food, clothing, and merchandise you wanted for free?
Then what would you do for the rest of your life? Right now,
computers are far too caught up in the "Oh Wow!" of gizmos,
technology, and glittering shiny things. Nobody cares. What people
want is a way to email their friends, IM their friends, watch
someone take off their clothes via WebCam, and shop without having
to stand in line. E-commerce, Internet gaming, the sex industry, and
streamed video are already driving the industry. Whichever companies
or industries are moving in those directions, that's where the
machines and operating systems will go.
Q: Share your top ten
certification study tips.
A: These all assume
you've decided to do a self-study system, rather than taking an
instructor-led course.
1) Read only enough pages
that you don't find yourself having to go back and re-read
something. If you're re-reading a paragraph, you're too tired. Stop.
2) After you've read
about a topic, or maybe even a chapter, go do something completely
different. Take a shower, wash the dishes, go fishing, clean
something. It'll keep your body busy, but allow your mind to go back
and visualize, then integrate what you've studied.
3) Don't have any other
information flow while you're studying. No matter how expert you may
be with computers, they're mostly numbers. We don't know how to
store number information in our heads, only pictures. Any other
information, like a radio, music, TV, or people doing things around
you, will distort the incoming information flow, and you won't
remember it well.
4) Analyze all practice
questions you have access to. Always remember that knowing something
is one thing, but answering questions is usually a completely
different skill.
5) Don't try to learn how
to pass the exam in a week. It's way too complicated, confusing, and
the questions are designed to mess with your head.
6) If you're a Mac user,
definitely take an instructor-led, classroom course. You absolutely
must have a lot of hands-on experience with PCs and Windows.
7) A+ certification is
not for entry-level computer technicians! It's for first-tier tech
support. It isn't about beginners, but rather, for intermediate
users.
8) Always spend time in a
text-based, command-line environment. There are way too many
questions about DOS for you to give them up because you don't know
what a DIR or ATTRIB command is.
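For readers who haven't touched a command prompt in a while, here is a short, hypothetical DOS/Windows command-prompt sketch of the two commands mentioned above (the directory and file names are made up for illustration):

```bat
REM List the files and subdirectories in C:\DOCS
DIR C:\DOCS

REM Display README.TXT's attribute flags (A, R, S, H)
ATTRIB README.TXT

REM Set the read-only attribute on the file
ATTRIB +R README.TXT

REM Clear the read-only and hidden attributes
ATTRIB -R -H README.TXT
```

Being comfortable enough to predict what each of these lines does, without reaching for the mouse, is roughly the level of command-line fluency the exam assumes.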
9) Try to avoid using the mouse for at least half an hour each day
you use your computer, before taking the exam. It'll force you to
pay highly focused attention to the navigational pathways to
fundamental areas of Windows. Whether you need to or not, each time
you sit down at the computer during your study phase, use keyboard
menu selections to go into the Control Panel, Device Manager, and
Display Properties, along with Accessories, Printer Configuration,
and Taskbar management. Do it at least once, and go around each area
as if you were trying to discover someone else's computer's
configuration.
10) Open up a PC and
remove all the attached devices. Re-attach them, then format the
hard drive and reinstall the machine.
Q: Please provide more
details about your new books on MS Office applications.
A: I've asked everyone I
encounter, "Do you think you know Excel?" Without fail, almost
everyone says they don't know it, but they wish they did know it
better. "Why?" I ask them. They don't know, but they think they
ought to know it better. So I asked myself why it's so hard to learn
Excel. It turns out that everyone thinks spreadsheets are about
numbers. But in fact, they're not.
Excel (my topic only
because it has the leading market share) is actually seven extremely
sophisticated concepts, intertwined with each other. As silly as it
may sound, I could make a very good case that Excel is as complex as
an integrated philosophy of reality. When I began approaching my own
expertise in Excel from the philosophic perspective, I "suddenly"
realized not only how sophisticated the program is, but also that
nobody is explaining any of it from that perspective.
I was building a very
complex spreadsheet for Arthur Andersen (as it sank beneath the
waves), and I needed one single piece of help on a particular
function. So I stopped in at Borders on my way home from work, and
sat down on the floor as I went through all 30 books they had on
Excel. What I found was that every book on the market pretty much
prints out the online Help, and rephrases each topic in some nominal
fashion. Not only didn't I find what I needed, but I also found that
not a single book was something I would buy if I wanted to learn the
program.
I believe people need to
learn Excel in two simultaneous ways. First, they need help from the
absolute ground up, all the way down to "remembering" the difference
between a positive and negative number. For example, we never
subtract anything. Instead, we add a negative number to a positive
number. People need help understanding even that basic concept.
Secondly, they want a very fast way to learn how to actually
accomplish a specific task.
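The "ground up" concept mentioned above can be made concrete with a minimal sketch (in Python rather than a spreadsheet formula, with made-up numbers): a spreadsheet expression like =A1-B1 gives the same result as =A1+(-B1), because subtraction is just addition of the additive inverse.

```python
# "We never subtract anything. Instead, we add a negative number."
# Subtraction expressed as adding the additive inverse:
def subtract(a, b):
    """Return a - b, computed as a + (-b)."""
    return a + (-b)

# Like =A1-B1 in a spreadsheet, where A1=100 and B1=35:
balance = subtract(100, 35)   # same as 100 + (-35)
print(balance)                # 65

# The rule also handles negative inputs naturally:
print(subtract(5, -3))        # 5 + 3 = 8
```

The point isn't the arithmetic itself, but that a learner who internalizes this one idea no longer needs separate mental rules for "subtracting" positive and negative quantities.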
My book on Excel is based
on what a temp needs to know in the office environment. They have to
hit the ground running, so to speak, and very quickly figure out how
to do lots of things they don't know how to do. To accomplish that,
the book should also be "task-oriented," in that it lists the most
common tasks in most real-world situations. All the years I spent
temping have given me a comprehensive background of what makes up
those tasks. And very few of them have to do with the standard sales
and inventory examples you'll find in every other book about Excel.
The same concepts apply to Word and PowerPoint, and there isn't a
single book I've found that addresses what a real person needs in
order to solve real problems.
Q: Your book for
teenagers sounds compelling. Can you share more details?
A: That's one of my real
passions. It began with my niece, who's now 23. She was having
serious problems in her life when she was 16. Soon after, she ran
away from home, abandoning her son. Everyone in the family wanted to
cut her loose, but since I was a thousand miles away, I said I'd
stand for her. I'd be her support system, and her point of reference
to reality. Over the next year, she would call me (collect) when she
had another crisis, and I'd help her understand what was going on
and how to get through it. I helped with money, too, but more often
with advice, understanding, and emotional support.
At one point, she said,
"You should write these conversations down," and that gave me the
idea for "Phone Calls with Cassie." Many people are writing about
the collapse of our society, and the moral decline taking place. But
although there's lots of anecdotal evidence, nobody seems to be
discussing why all this is happening. Our kids are taking the hit.
Today's teenagers have no philosophic education at all. As a result,
they've lost the ability to do critical thinking, and have no
"authorities" to help them with the many crises they go through.
Those problems are very complex, and far more dangerous today than
they were even thirty years ago.
I've applied my
philosophy and psychological help to not only my niece, but to many
of the teens and kids I meet. In every case, the one thing that
really helps is for them to know that there are judgements! There is
an objective right and wrong, and we not only can, but must make
moral judgements and decisions every day. Today's adult society has
almost completely abdicated all responsibility for making those
judgements, and for teaching our children how to reason out analytic
and emotional problems. It's one reason why Dr. Laura or Dr. Phil
have such popularity.
"Phone Calls with Cassie"
is something like Judy Blume's book "Are You There God? It's Me,
Margaret." It goes further into the philosophic principles of navigating
life, but at about an 8th-grade reading level. It's like a reference
manual for life, based on principles of philosophy that have nothing
to do with personal opinion, or some moral code developed by
individual societies. Rather, it's about global principles that
apply to all human beings, explained in terms of a sort of composite
teenager, based on what my niece went through.
Q: What are the ten most
compelling issues facing technology professionals today and in the
future? How can they be resolved?
A: 1) Lack of
standardization in all things PC: That's the thing that's really
hurting the IT business.
2) An almost complete
failure to understand the needs of the end-user. Remember that IBM
commercial, where the young kid is building a Web site? He sits next
to an older, gray-haired guy, talking about moving this, animated
that, flames and fire. The old guy has glazed eyes, but listens.
Finally, at the end of the commercial, the old guy says, "That's all
well and good. But I'd like a way to subtract an item from my
inventory whenever a customer buys that item." The kid pauses, gets
a nervous look, and responds that he doesn't know how to do that.
3) I was onstage for 20
years, and thought that the quality of the music, the sound of the
speakers, and the "flash" of technique, mattered. When I left the
business, I became part of the audience. Only then did I understand
that music is "background" to most people. Only the singer and the
words matter, except in certain special cases. IT professionals
are like me, onstage: they're so caught up in how cool the latest
feature and hardware are, they've completely lost sight of the fact
that technology and machines are "background" to everyone else.
4) I think every
applications developer should have to first work in a real-world
business environment, trying to solve whatever problem it is they
think they want to develop as an application. In the vast majority
of cases, developers produce something that looks pretty on the
screen but has almost no practical application to a real person in
the real world. After all, how many people actually use the
hyperlinking features in MS Word? Sure, it might be useful, but more
often it's a waste of time, a burden to learn, and an interference.
There are much better Web-authoring applications in the marketplace.
5) Being all things to
all people is another major destructive course in technology. Excel
is a spreadsheet, not a database. But because Microsoft made Access
too complicated for 99.9% of humanity, anyone who wants just a
simple database ends up having to use Excel. Although it manages
simple lists, it isn't a database. End-users now have no easy way to
manage, develop, or use a database. Let's all try and get back to
the place where a single application does a single thing, perfectly,
simply, and elegantly.
6) Destructive
competition, profiteering, and a belief in a zero-sum economy are
another catastrophic failure in paradigm. There's plenty of room for
competing products without one company trying to destroy another.
Without competition, everyone loses. Microsoft's destructive
competition with Netscape has ended up with a mediocre browser,
filled with security holes, and not a single innovative change to
how we use the Web. (ActiveX, for all that everyone thinks it's fabulous,
is really just a bunch of fire and flames. It doesn't do anything
useful, for the most part.)
7) Ultimately, technology
is just junk in a box. There's a fundamental principle of
engineering: Form follows Function. How does a computer follow
function? It isn't natural for someone to use a keyboard to input
information. We interact with voice, sound, touch, and vision.
Sometime soon, computers and robotics will merge. At that point,
almost every IT professional today will be out of business—unless
they're involved in systems implementation, and using technology to
do things! Not just being a mechanic.
8) Another great anecdote
has to do with the CEO who wants to know what strategies his VPs
have come up with in a business situation. All the managers run off
to their spreadsheets and come back with "the numbers." They spew
out their scenarios, and say that's what the numbers show and the
computers show. But the CEO isn't at all interested. He wants to
know, based on those numbers, what the people think! IT
professionals have lost almost all sense of context. All of us need
to sit back and think about whether what we're doing has any meaning
or bearing at all on the real world.
9) When Microsoft finally
collapses, it's going to send a tidal wave through the entire
industry. Intel might likely follow, having bet the farm on their
Itanium chips and their limit of only two processors. AMD and IBM are
forming up on the horizon, and there's Linux out there, along with
the entire Open Source movement. Windows XP is one of the worst
"kludge" operating systems, but Microsoft has used monopolistic
strategies to force it on people who don't have the time or energy
to work out something else. It's a house of cards, just about ready
to come tumbling down. Anyone in the IT business who's counting on a
future based on the present is probably going to be in for distress.
Focus on the underlying principles of information theory, not on the
specific applications of today's technology. We're in the infancy of
computer technology, poised to make the next evolutionary leap.
That'll happen, hopefully within the next 5-10 years, well within
our lifetimes, and anyone who's not prepared for it may as well
open a buggy-whip manufacturing plant.
10) The entire field of
IT is going to have to get around to portable knowledge. It's common
knowledge that the lifetime for a technical person is around 10
years before they get totally burnt out. The reason for the burnout
is that nothing they learn at one point, transfers to the next
"thing" that comes out. Almost none of my training in Windows 3.x
means anything in terms of XP. Nothing I know about an AT machine
translates to a modern Pentium. No other field in the world is
organized that way, for the most part. Airplanes may be very
different today from 100 years ago, but the principles of air,
flight, propulsion, fuel, landing and taking off, are still the
same. We need to begin developing technical tools that carry forward
underlying principles from previous knowledge, and try to cut back
on the whole concept of "obsolete" technology.
Q: List the 5 best
resources for technology and business professionals.
A: 1) I'd say that
hands-on experience is still the very best resource for anyone.
2) Books are also still a
very good, reliable, easily accessible resource, with the reference
manual being the best of all. I believe RTFM is still a fairly
well-known acronym for most tech people.
3) The workplace: I can't
even begin to count the number of times I've thought something was
impossible. It's only when a non-technical person asked me to put
something together for them that I came to understand the maxim:
"There's always a workaround!" And I learned all kinds of things I
never would've attempted to learn, knowing, as I did, that they were
"impossible."
4) The Internet. There's
no question that a world-wide resource is an incredible resource.
The problem is: why should anyone bother to go through all that
effort to put up a Web site with white papers, explanations, or
whatever, for free? We're still trying to work that out, but for the
moment, search engines and the Web are invaluable.
5) Gamers! Anyone who's
older than around 40 began with computers back when PC Magazine was
the top source for innovative technology. But just as the entire
industry evolves at high speed, we become "The Establishment" that
much faster. Today's creative minds have
been almost frozen out of the established business by old, staid,
rest-on-their-laurels sources left over from 10 years ago. That's
not where it's at, to use a 60s expression. Listen to the hackers,
gamers, and over-clockers. That's where the future lies.
That's about it, in terms
of resources. Understanding technology isn't something you do by
just going to the library. Either you have a facility for it—a
passion—or you don't. If you've got that passion, then you'll take
the time to ask "Why?" all the time. Then ask "How?" Technology
isn't for parrots, or monkeys pushing buttons in a predefined
sequence that someone else laid down on paper. We're still at only
the most simplistic level of the blend between the human mind and
the mechanical machine. Whatever develops your imagination: that's a
superb resource. It isn't the box that matters: it's coming up with
things to do with that box that counts.
Q: What kind of computer
setup do you have?
A: Hah! I have an old
Micron, with a Pentium 133 and 4GB hard drive. Until recently, that
was all I needed to write books, surf the Web, email my friends, and
do business. Now I'd like to get into an idea I have for an animated
video about computers, and that old 133MHz seems a bit slow for
video editing. Actually, I'm just tired of the never-ending stream
of "new" this or that. I've been far more busy doing things, living
my life, and putting ideas into effect. Now that I'm going to have
to upgrade, I guess I'll set aside the waste of time it'll take to
reinstall everything and get it to where I like it. I'm looking at
an Athlon, probably with a fairly large hard drive, and probably a
DVD-RW of some kind. But I probably won't spend more than about $400
for the whole thing. I already have a video card, and a 19" LCD
panel.
Q: Craig, thank you again
for your time, and consideration in doing this interview.
A: You're very welcome. I
hope it helps someone, somewhere along the line. And of course, I'd
be pleased if everyone who reads this also buys the “A+ Exam Cram
2”, available in fine bookstores everywhere. By the way, Que has
included a nifty CD in the back cover, with some video lectures by
Scott Mueller, one of the great hardware guys in the industry. His
book about upgrading and repairing PCs is a standard, also published
by Que.