See the detailed document for more specific recommendations and requirements that are derived from these general policies.
Organizations have computer systems to assist them in accomplishing their primary mission. Like any other tool, there is a risk that a computer system will break or be broken, resulting in a reduced ability to accomplish that mission. This may cost the organization money in lost productivity, damage its reputation with customers, and so on. Computer security is about mitigating the risk of harm to an organization if its computer systems are broken (either accidentally or on purpose).
Historically, the focus of computer security has been on reducing the chance that computer systems can be broken. This is still the primary focus, but there is growing discussion in the computer security community about covering the remaining risk with insurance. Some computer security insurance policies are on the market, but it remains an open question how insurance companies will assess the quantity of risk in any given environment, or whether insurance can realistically cover this type of risk.
Networks exist to transmit information between computer systems. The receiving system may not have requested the information delivered to it, may not want it, or, worst of all, may not be prepared to handle it. As such, the network should be designed to limit access from untrusted systems to trusted systems. In addition, networks should be monitored to ensure that unwanted traffic does not slip past network defenses and that systems believed to be trusted are not acting in an unexpected fashion.
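The policy of limiting access from untrusted systems to trusted ones can be illustrated with a minimal packet-filtering sketch. The trusted network range and allowed ports below are hypothetical placeholders, not values prescribed by this policy; real rules depend on the site's architecture.

```python
import ipaddress

# Hypothetical trusted network and exposed ports; actual values are site-specific.
TRUSTED_NETS = [ipaddress.ip_network("10.0.0.0/8")]
ALLOWED_PORTS = {22, 443}  # the only services explicitly offered to untrusted systems

def allow_packet(src_ip: str, dst_port: int) -> bool:
    """Permit any traffic from trusted networks; from untrusted networks,
    permit only traffic to explicitly allowed service ports."""
    src = ipaddress.ip_address(src_ip)
    if any(src in net for net in TRUSTED_NETS):
        return True
    return dst_port in ALLOWED_PORTS
```

A real deployment would express the same decision in firewall rules and pair it with monitoring of what the filter actually passes.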
Hosts exist to perform some function. For some machines this function is well defined (a web server or a router, for example). For other machines it is much less clear (a desktop system). As much as possible, systems should have a clearly defined purpose. Otherwise, monitoring and auditing the system to see whether it is performing as intended is difficult or impossible.
As a general rule, computer systems should be designed to resist attack and accidental misuse as much as possible, and fail gracefully in the face of problems that can't be avoided.
Hosts should only offer the services and applications necessary to perform their function. Unless a service is distributing explicitly public information, it should authenticate entities requesting access, using authentication methods that are as strong as is feasible. Access to information and system resources should only be granted as necessary.
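One ingredient of authentication that is "as strong as is feasible" is never storing or comparing passwords in the clear. The sketch below, using a salted slow key-derivation function (PBKDF2) and a constant-time comparison, is an illustration under those assumptions, not a mandated implementation.

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes) -> bytes:
    """Derive a digest from the password with a salted, deliberately slow
    KDF (PBKDF2-HMAC-SHA256) so bulk guessing is expensive."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

def verify_password(password: str, salt: bytes, expected: bytes) -> bool:
    """Compare digests in constant time to avoid leaking information
    through timing differences."""
    return hmac.compare_digest(hash_password(password, salt), expected)

# Typical use: generate a fresh random salt per account and store
# (salt, digest) rather than the password itself.
salt = os.urandom(16)
stored_digest = hash_password("example-passphrase", salt)
```

The salt and iteration count here are illustrative; a site would choose parameters appropriate to its hardware and threat model.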
Computer systems should be designed to survive various failures in accordance with the risk associated with their unavailability. Likewise, data should be protected in a manner consistent with the risk associated with its loss. (In other words, don't spend thousands of dollars on duplicate computer systems or offsite backups if the primary system or data isn't very valuable. But do spend the money on data that would be expensive or impossible to reproduce or systems that are critical.)
Computers should log their activity. This facilitates reporting on how systems are being used, detecting misuse of systems, and performing forensic analysis in the event of a security incident. Computers should also be monitored and audited to ensure that they are performing their intended function and nothing else.
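As a sketch of activity logging that supports later misuse detection, the example below records every authentication attempt, failed ones at a higher severity. The in-memory buffer is purely for illustration; a real deployment would send records to protected local files or a centralized log server.

```python
import io
import logging

# Illustrative audit logger writing to an in-memory buffer; destinations
# and record formats in practice are dictated by site policy.
log_buffer = io.StringIO()
audit = logging.getLogger("audit")
audit.setLevel(logging.INFO)
handler = logging.StreamHandler(log_buffer)
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
audit.addHandler(handler)

def record_login(user: str, source: str, success: bool) -> None:
    """Log every authentication attempt; failures are flagged as warnings
    so monitoring can pick them out."""
    level = logging.INFO if success else logging.WARNING
    audit.log(level, "login user=%s source=%s success=%s", user, source, success)
```

Logging only matters if someone (or something) reads the logs: the monitoring and auditing the policy calls for would routinely scan such records for unexpected activity.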
Above all, remember that computer security is an ongoing process. Sufficient system administrator resources must be available to ensure ongoing compliance with this policy. Computers that cannot be adequately maintained should be shut down.