
01. Computer Composition Principles and Architecture

2024-07-24 17:45:58


Contents

  • 01. Underlying computer knowledge
    • 1.1 Basic computer components
    • 1.2 Integration of theory and practice
  • 02. Basic computer hardware
    • 2.1 Basic Hardware Composition
    • 2.2 Input and output devices
  • 03. von Neumann Architecture
    • 3.1 Stored-program computers
    • 3.2 Von Neumann description of computers
    • 3.3 Abstract computer framework
    • 3.4 Extension of the von Neumann system
    • 3.5 Synthesizing cases to understand concepts
    • 3.6 Data interaction level design
    • 3.7 Data flow level design
    • 3.8 Control Level Design Ideas
    • 3.9 Let's look at a practice problem
  • 04. Principles of Computer Composition
    • 4.1 Computer Knowledge Map
    • 4.2 Basic computer components
    • 4.3 Controller
    • 4.4 Memory
    • 4.5 Input devices
    • 4.6 Output devices
    • 4.7 Operators
  • 05. How to learn the principle of composition
    • 5.1 The body of knowledge is too big
    • 5.2 How to study the column

01. Underlying computer knowledge

1.1 Basic computer components

  • At its simplest, a computer is a piece of hardware consisting of a CPU, memory, and a monitor.
    • Nowadays most programmers work on software of various kinds, so there has to be a bridge between hardware and software. The Principles of Computer Composition plays exactly that role: it isolates software from hardware and provides an interface that lets software manipulate the hardware without caring about its details.
    • With a principled understanding of the hardware, you can rely on it and confidently write programs in a high-level language, whether hard-core code like operating systems and compilers or application-level code like web applications and mobile apps.
  • Whichever core computer science course you want to learn, you should study the Principles of Computer Composition first, so that you gain a global understanding of every aspect of computing, from hardware principles to software architecture.
    • image

1.2 Integration of theory and practice

  • So much has been said about the importance of the principles of computer composition, but how exactly do you learn them?
    • "Buying books is like toppling a mountain; reading them is like drawing silk thread." In my years of practice, I have met many engineers who wanted to learn the principles of composition, but few who truly persisted to the end and learned them well.
  • From my experience in study and work, I have identified three main reasons for this.
    • First, the breadth. The Principles of Composition covers a great many concepts, each carrying a large amount of information. For example, to understand how the arithmetic logic unit (ALU) in the CPU implements addition, you need to know how integers are represented in binary, and also understand the circuits behind those representations: logic gates, CPU clocks, flip-flops, and so on.
    • Second, the depth. Many concepts in the principles of composition, when elaborated, are themselves core courses in the computing discipline. For example, how do the instructions of a program get from a high-level language like C or Java to machine code the computer can execute? Expand and dig into this question and it becomes a core course like the Principles of Compilation.
    • Third, what you learn is hard to put to use. Knowledge should be applied, but given the nature of the course, many learners get absorbed in concepts and theories and cannot connect them to daily development work or use them to solve problems met on the job. Without a sense of accomplishment, it is hard to stay motivated.
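To make the first point above concrete, here is a minimal sketch (in Python, purely for illustration) of the idea behind the ALU's addition: XOR produces the sum bits, AND produces the carry bits, and the carry is shifted and fed back until it dies out, just as in a ripple-carry adder. The function name and the 8-bit word size are assumptions for the example, not any real hardware interface.

```python
# A toy model of how an ALU adds two integers using only logic-gate
# operations (XOR, AND) and shifts, within a fixed word size.

def alu_add(a: int, b: int, bits: int = 8) -> int:
    """Add two integers with bitwise ops, wrapping at `bits` bits."""
    mask = (1 << bits) - 1          # keep results within the word size
    a, b = a & mask, b & mask
    while b:
        carry = (a & b) << 1        # AND finds the carry bits
        a = (a ^ b) & mask          # XOR adds without carry
        b = carry & mask
    return a

# Two's complement: -5 in 8 bits is 0b11111011 (251 unsigned)
print(format(alu_add(5, 3), "08b"))    # 00001000 (8)
print(format(alu_add(251, 5), "08b"))  # 00000000 (-5 + 5 wraps to 0)
```

The wrap-around in the second call also shows why fixed word length matters: the carry out of the top bit is simply discarded.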

02. Basic computer hardware

2.1 Basic Hardware Composition

  • In the early years, to assemble a computer yourself, you had to start with three major components: the CPU, memory, and the motherboard.
    • The CPU is the most important core component of a computer; its full name, as you surely know, is the Central Processing Unit. Why is the CPU "the most important"? Because all of the computer's "calculations" are carried out by the CPU. Naturally, it is also one of the most expensive parts of the computer.
    • Memory (RAM): the programs you write, the browsers you open, the games you run, all have to be loaded into memory to run. The data a program reads and the results it computes also have to be put in memory. The more memory you have, the more you can load.
    • The motherboard: programs and data stored in memory need to be read by the CPU, and after the CPU finishes its calculations, the results have to be written back to memory. However, the CPU cannot be plugged directly into the memory, and vice versa. This brings us to the last big piece, the motherboard.

2.2 Input and output devices

  • With the three main components in place, plus a power supply, the computer can almost run. But it still lacks the various input/output devices, commonly referred to as I/O devices.
    • If you are using your own personal computer, a monitor is certainly essential; only with a monitor can we see the images and text the computer outputs. It is what we call an output device.
    • Likewise, the mouse and keyboard are essential accessories, which let me enter text and write this article. They are known as input devices.

03. von Neumann Architecture

3.1 Stored-program computers

  • Whether it is a PC, a server, a smartphone, or a card-sized machine like the Raspberry Pi, they all follow the same abstract concept of a "computer".
    • What kind of computer is this? It is the von Neumann architecture proposed by John von Neumann, one of the founding figures of computing, also known as the stored-program computer.
  • What is a stored program computer?
    • There are actually two concepts implied here, a "programmable" computer and a "storage" computer.
  • When it comes to "programmable", you may be a bit confused; it helps to think about what "non-programmable" means.
    • A computer is built from combinations of various gate circuits, assembled on a fixed circuit board to carry out one specific computation. Once the functionality needs to change, the circuitry has to be reassembled. Such a computer is "non-programmable", because the program is hard-wired at the hardware level.
    • The most common example is the old-fashioned calculator, whose circuit board is fixed to add, subtract, multiply, and divide, and can do nothing outside that fixed calculation logic.
  • Now let's look at the "stored-program" computer. This means the program itself is stored in the computer's memory, and different programs can be loaded to solve different problems.
    • There are "stored-program computers" and, naturally, there are computers that cannot store programs.
    • Typical examples are the early plugboard computers. The entire computer was a giant plugboard, and different functions were realized by inserting wires into different plugs or ports on the board. Such a computer is "programmable", but the program cannot be stored for the next run; every time a different "program" is needed, the wires have to be re-plugged and the machine re-programmed.

3.2 Von Neumann description of computers

  • Von Neumann described what he thought a computer should look like.
    • First is a Processing Unit, containing an Arithmetic Logic Unit (ALU) and processor registers, which performs various arithmetic and logical operations. Because it carries out all kinds of data processing and calculation work, some people also call it the datapath or the operator.
    • Then there is a Control Unit (CU), containing an Instruction Register and a Program Counter, which controls the flow of the program, typically handling branches and jumps under different conditions. In today's computers, the arithmetic logic unit above and the control unit here together make up what we call the CPU.
    • Then there is memory for storing data and instructions, as well as larger amounts of external storage, which in the past might have been devices like tapes and drums but nowadays is usually a hard disk.
    • Finally, there are the various input and output devices, and the corresponding input and output mechanisms. No matter what kind of computer we are using now, we are actually dealing with input and output devices. The mouse and keyboard of a personal computer are input devices, and the monitor is an output device. The touch screen of our smartphones is both an input and an output device. The servers that run on various clouds use the network for input and output. At this time, the network card is both an input and output device.

3.3 Abstract computer framework

  • Any component of any computer can be categorized into operators, controllers, memories, input devices, and output devices, and all modern computers are designed and developed based on this infrastructure.
    • All computer programs, in turn, can be abstracted as reading input information from an input device, executing the program stored in memory through operators and controllers, and ultimately outputting the results to an output device.
  • All the programs we write, whether in high-level or low-level languages, also operate on the basis of this abstract framework.
  • Von Neumann architecture, also known as Princeton architecture, is a design concept that stores programs and data at different addresses in the same memory, with the following core features:
    • A computer consists of five parts: an operator, a controller, a memory, and input and output devices
    • Data and programs are stored indiscriminately in memory as binary code, the exact location of which is determined by the memory address
    • Sequential execution of programs, i.e., when a computer works, the program to be executed and the data to be processed are first stored in memory (RAM), and then the instructions are automatically and sequentially taken out of memory and executed one by one.
    • image
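The stored-program idea above can be sketched as a toy machine: instructions and data sit side by side in one memory as plain numbers, and a program counter steps through them in sequence. The opcode encoding here is invented for illustration and is not any real instruction set.

```python
# A toy stored-program machine. `memory` holds both instructions and data
# as numbers; the program counter (pc) fetches them one by one.

def run(memory: list[int]) -> int:
    acc = 0          # accumulator (a minimal "operator")
    pc = 0           # program counter (part of the "controller")
    while True:
        opcode, operand = memory[pc], memory[pc + 1]
        pc += 2                      # sequential execution: PC advances
        if opcode == 0:              # HALT
            return acc
        elif opcode == 1:            # LOAD addr: acc = memory[addr]
            acc = memory[operand]
        elif opcode == 2:            # ADD addr: acc += memory[addr]
            acc += memory[operand]
        elif opcode == 3:            # STORE addr: memory[addr] = acc
            memory[operand] = acc

# Program: load memory[8], add memory[9], store into memory[10], halt.
mem = [1, 8,  2, 9,  3, 10,  0, 0,  20, 22, 0]
result = run(mem)
print(result, mem[10])   # 42 42
```

Changing the numbers in `mem` changes the program, with no rewiring: that is exactly the difference between this machine and the plugboard computers described earlier.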

3.4 Extension of the von Neumann system

  • The von Neumann architecture establishes the infrastructure of the computer hardware we use every day. Learning the principles of computer composition is really about learning and dismantling the von Neumann architecture.
    • Specifically, it means learning how the controller and the operator work, that is, how the CPU works and why it is designed that way; how memory works, from the most basic circuits up through the layers of abstraction to the interfaces exposed to the CPU and even to applications; and how the CPU deals with input and output devices.
    • In short, to learn the principles of composition is to understand how the hardware, from controllers, operators, memories, input devices, and output devices, works all the way from the circuits to the interfaces ultimately exposed to software, why it is designed the way it is, and how to use it as well as possible at the software development level.

3.5 Synthesizing cases to understand concepts

  • Suppose that during a vacation, Xiao Yang suddenly remembered a friend he had not contacted for a long time, casually opened QQ on his computer, typed the message he wanted to send in the dialog box, and sent it to his friend. The diagram below illustrates the process:
    • image
  • Understand von Neumann architecture through this comprehensive case study
    • Xiao Yang opened the QQ software. Before the program can run, it needs to be loaded into memory. This is dictated by the architecture: the CPU can only read data from memory, not directly from peripherals.
    • The code we write is compiled into binary instructions to be handed to the CPU, because the CPU completes its work according to the instructions it receives: given a binary instruction, it identifies the data to be processed and, by matching the instruction against its instruction set, determines how to handle it.
    • Xiao Yang enters a message through the keyboard, a process called I/O (input/output). The message is transferred into memory as binary, which involves interaction between memory and the peripherals.
    • The message is sent over the network to the friend. When the friend's machine receives it, the binary data needs to be deserialized, and the data is first loaded into memory. This processing is carried out by the CPU, which does not "deal" with peripherals directly, only with memory.
    • Finally, the message is printed to the display. This is the output stage: the result goes through the I/O process and is finally presented as an interface.

3.6 Data interaction level design

  • How the von Neumann architecture is designed at the data-interaction level
    • The CPU doesn't deal directly with peripherals, it only interacts with memory
    • Data from a peripheral must first be loaded into memory, and data leaving memory must likewise be written out to the peripherals
  • The purpose of this is to increase the efficiency of the whole machine
    • Reason: we all know the barrel principle, which says the amount of water a barrel can hold depends not on the longest plank but on the shortest one.
    • Computers are similar. The CPU is like the longest plank; if the CPU had to interact with a slow peripheral every time data was exchanged, its speed would be wasted.
    • So before the CPU needs data, the data in the peripheral is loaded into memory, and when the CPU needs it, it fetches it directly from memory. Memory is much faster than peripherals, so this improves the efficiency of the whole machine.
  • Why is it important to load memory before a program runs?
    • Because the CPU has to execute our code and access data, it can only access memory. This is what the architecture dictates.
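A rough back-of-the-envelope model of the design above: the latency figures below are made-up orders of magnitude, chosen only to show why staging peripheral data in memory pays off when the CPU needs many items.

```python
# Why peripheral data is staged in memory: compare per-item disk access
# against one bulk load followed by memory reads. Numbers are illustrative
# assumptions, not measurements of any real hardware.

MEM_ACCESS_NS = 100        # one memory access (assumed)
DISK_ACCESS_NS = 100_000   # one disk access (assumed)
N = 1_000                  # number of items the CPU needs

# CPU fetches each item directly from the peripheral:
direct = N * DISK_ACCESS_NS

# The OS loads all items into memory in one bulk transfer first,
# then the CPU reads each item from memory:
buffered = DISK_ACCESS_NS + N * MEM_ACCESS_NS

print(direct, buffered)    # 100000000 200000
```

Under these assumed numbers, buffering is a 500x win: the slow device is touched once instead of a thousand times, which is the "shortest plank" being bypassed.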

3.7 Data flow level design

  • For example, suppose you are having a QQ chat with an online friend: what hardware does a message pass through between the moment you send it and the moment your friend receives it?
    • image
  • What a complete data flow looks like
    • First you enter the message through the keyboard, and it is loaded into the space in memory that belongs to QQ; the CPU then processes the message (encrypting it, for example) and writes it back into memory, from which the data is written out to the network card.
    • It is sent over the network to the network card of the friend's computer; the received data is written into memory, processed by the CPU and written back to memory, and finally goes from memory to the monitor.

3.8 Control Level Design Ideas

  • How does memory know that there is data in the peripheral that needs to be loaded into memory when we enter it through the keyboard?
    • When is the data in memory written to disk (certainly not continuously; that would be too slow), i.e., when is the buffer flushed?
    • 1_3_8_image
  • The black arrows here are control signals; the operations above are handled by the CPU's controller.
    • And whose instructions is the controller executing when it controls these peripherals and memory? It is in fact the operating system (OS), which is behind the scenes at all times helping the CPU make these decisions.

3.9 Let's look at a practice problem

  • EXAMPLE: The basic way a von Neumann machine works is:
    • A. control-flow driven approach; B. multi-instruction, multi-data-flow approach; C. micro-program control approach; D. data-flow driven approach
  • Answer: A; The von Neumann machine works in what can be called a control flow (instruction flow) driven manner.
    • That is, the instructions are read in sequence according to the execution sequence of the instructions, and then the data is called for processing based on the control information contained in the instructions.
    • B describes a multiprocessor, while the von Neumann machine is a single-instruction-stream, single-data-stream system. The data-flow-driven approach means that an instruction (or group of instructions) is triggered for execution only when all the operands it requires are ready; the results of its execution then flow to the next instruction (or group) waiting for this data, driving that instruction (or group) to execute.
    • Thus, the order of execution of the instructions in a program is determined solely by the data dependencies between the instructions.
  • The stored-program concept: instructions are loaded into the computer's main memory beforehand in the form of code; execution then starts from the program's first instruction at its starting address in memory, and the remaining instructions are executed in the program's prescribed order until the program ends.
    • 1_3_9_image

04. Principles of Computer Composition

4.1 Computer Knowledge Map

  • As you can see from this diagram, the entire principle of computer composition revolves around how computers are organized to function.
    • image

4.2 Basic computer components

  • What hardware does a computer consist of? How does this hardware correspond to the five basic components of the classical von Neumann architecture: the operator, the controller, the memory, the input device, and the output device?
    • In addition to this, you need to understand two core metrics of computers, performance and power consumption. Performance and power consumption are also factors that we need to focus on in applying and designing the five basic components.
  • In the computer instructions section, you need to figure out how the lines of C, Java, and PHP programs we write every day run inside the computer.
    • Here, you need to understand how our programs are turned into machine instructions by the compiler and assembler (expand this compilation process and it becomes a complete course on the principles of compilation), and also how the operating system links, loads, and executes these programs (studied in more depth, this part becomes a course on operating systems).
    • The execution of each instruction is controlled by the controller, one of the five major components of the computer.
    • In the computation part, it is important to start from binary and encoding to understand how data is represented in the computer, and how the basic arithmetic functions of addition and multiplication are implemented at the digital circuit level.
    • The arithmetic logic unit (ALU) that implements these operations is in fact another of the five major components of our computers: the operator.

4.3 Controller

  • The controller is the command center of the computer, from which the components are directed to work automatically and in a coordinated manner. The controller consists of program counter PC, instruction register IR and control unit CU.
    • The PC stores the address of the instruction currently to be executed, can automatically increment to form the address of the next instruction, and has a direct path to the MAR in the main memory.
    • The IR stores the current instruction; its contents come from the main memory's MDR. The opcode OP(IR) of the instruction is sent to the CU, which analyzes the instruction and issues the various sequences of micro-operation commands; the address code Ad(IR) is sent to the MAR to fetch the operands. (The opcode indicates which operation the machine performs; the address code indicates where in memory the operands participating in the operation are located.)
    • 1_4_3_image
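The register flow just described can be traced in a few lines: PC feeds MAR, the fetched word passes through MDR into IR, and the instruction is split into OP(IR) and Ad(IR). The 8-bit opcode / 8-bit address layout here is an assumption made for the sketch.

```python
# One instruction fetch, following the register flow described above:
# PC -> MAR, memory word -> MDR -> IR, then OP(IR) and Ad(IR) split out.

memory = {0: 0x01_2A}   # one 16-bit word: opcode 0x01, address 0x2A

pc = 0
mar = pc                # PC has a direct path to MAR
mdr = memory[mar]       # the addressed word arrives in MDR
ir = mdr                # IR is loaded from MDR
pc += 1                 # PC self-increments to the next instruction

op = ir >> 8            # OP(IR): sent to the CU for decoding
ad = ir & 0xFF          # Ad(IR): sent back to MAR to fetch the operand

print(hex(op), hex(ad), pc)   # 0x1 0x2a 1
```

The next step of a real cycle would place `ad` back into MAR to fetch the operand, which is exactly the Ad(IR) → MAR path in the text.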

4.4 Memory

  • Memory is the storage component of a computer that holds programs and data.
    • Memory is divided into primary memory (main memory, internal memory, RAM) and secondary memory (auxiliary memory, external memory, external storage).
    • The memory that the CPU can access directly is the main memory. Auxiliary memory is used to help the main memory memorize more information, and the information in the auxiliary memory must be transferred to the main memory before it can be accessed by the CPU.
  • The main memory includes the memory body M, various logic parts and control circuits.
    • A storage body consists of a number of storage units, each containing a number of storage elements (also called storage cells), each of which stores a single binary bit, 0 or 1.
    • Thus a memory cell can store a string of binary code, called a memory word, and the number of bits in the string is the memory word length, which can be 1B or an even multiple of a byte. A memory word may represent either a number, a string of characters, an instruction, etc.
    • The main memory works as follows: storage units are accessed by their address; this access method is called access by address, i.e., the memory is addressed by address.
  • Memory holds binary information
    • The address register (MAR) holds the access address and finds the selected memory cell after address decoding.
    • The data register (MDR) is used to temporarily store information to be read or written from memory, and the timing control logic is used to generate the various timing signals required for memory operations.
    • image

4.5 Input devices

  • The primary function of input devices is to enter programs and data into a computer in the form of information that the machine can recognize and accept.
    • Common input devices are keyboards, mice, scanners, video cameras, etc.

4.6 Output devices

  • The task of the output device is to output the results of computer processing in a form acceptable to people or in the form of information required by other systems.
    • Commonly used output devices are monitors, printers, etc. Computer I/O devices are the bridge between the computer and the outside world.

4.7 Operators

  • Operators are the execution parts of a computer that are used to perform arithmetic and logical operations.
    • Arithmetic operations follow the rules of arithmetic, such as addition, subtraction, multiplication, and division; logical operations include AND, OR, NOT, XOR (exclusive or), comparison, and shifts.
    • The core of the operator is the arithmetic logic unit (ALU). The operator also contains a number of general-purpose registers for temporarily storing operands and intermediate results, such as the accumulator ACC, the multiplier-quotient register MQ, the operand register X, the index register IX, and the base register BR; of these, ACC, MQ, and X are required.
    • The operator also contains the Program Status Word register PSW, also known as the flag register, which stores flag information and processor status information produced by ALU operations.
    • 1_4_7_image
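Here is a sketch of the logical operations listed above, with a PSW-style zero flag and sign flag derived from each result. The 8-bit word size and the choice of flags are assumptions made for illustration.

```python
# A toy operator: perform a logical operation, clamp to the word size,
# and set two PSW-style flags (zero and sign) from the result.

def alu(op: str, a: int, b: int = 0, bits: int = 8):
    mask = (1 << bits) - 1
    ops = {
        "and": a & b,
        "or":  a | b,
        "not": ~a,
        "xor": a ^ b,          # exclusive or
        "shl": a << 1,         # shift left
        "shr": a >> 1,         # shift right
    }
    result = ops[op] & mask
    psw = {
        "zero": result == 0,                  # ZF: result is zero
        "sign": bool(result >> (bits - 1)),   # SF: high bit set
    }
    return result, psw

print(alu("xor", 0b1100, 0b1010))  # (6, {'zero': False, 'sign': False})
print(alu("not", 0))               # (255, {'zero': False, 'sign': True})
```

The flags are the part worth noticing: conditional branches in the controller are driven by exactly this kind of status information left behind by the operator.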

05. How to learn the principle of composition

5.1 The body of knowledge is too big

  • Compared with other subjects in computer science, Principles of Computer Composition is more like an "outline" for the entire discipline. Dig deeper into any knowledge point in this course and it can become a core course in computer science.
  • For example, how a program goes from high-level code to instructions running in a computer corresponds to the courses "Principles of Compilation" and "Operating Systems"; behind the implementation of computation lies "Digital Circuits"; and to go deeper into optimizing CPUs and memory systems, you must understand "Computer Architecture" in depth.

5.2 How to study the column

  • First, learn to ask yourself questions to connect the dots. After learning a point, ask yourself the following two questions.
    • How does a program I write go from typed code to a running program with a final result?
    • What steps does the whole process go through at the computer level, and what can be optimized?
  • Internalizing knowledge by teaching and learning
    • Whether it is the compilation, linking, loading, and execution of a program, the logic circuits and ALU needed for computation, the pipelining, instruction-level parallelism, and branch prediction the CPU silently does for you, or the corresponding access to the hard disk and memory and the data loaded into the cache, each corresponds to a knowledge point we are learning.
    • It is advisable to go through it in your own head, preferably verbalize it or write it down, as this will be very helpful in grasping all these points thoroughly.
  • Write some sample programs to verify knowledge
    • Computer science is a practical discipline. A large number of the principles and designs in computer architecture come down to the word "performance". Therefore, a good way to integrate these knowledge points is to write sample programs that compare performance.
    • A sample program with a clear performance comparison leaves a deeper impression than memorizing knowledge points, and when you want to review them, a program is also more likely to help you retrieve them from the depths of your memory.
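As one possible sample program in this spirit, the snippet below times two traversal orders over the same 2-D data. In languages close to the hardware, the row-major order wins clearly because of cache locality; in CPython, interpreter overhead can mask part of the effect, which is itself a useful observation to discuss.

```python
# Time two traversal orders over the same 2-D data and compare.
import time

N = 1_000
grid = [[1] * N for _ in range(N)]

def row_major() -> int:
    total = 0
    for i in range(N):
        for j in range(N):
            total += grid[i][j]   # walks each inner list sequentially
    return total

def col_major() -> int:
    total = 0
    for j in range(N):
        for i in range(N):
            total += grid[i][j]   # jumps between inner lists each step
    return total

for fn in (row_major, col_major):
    start = time.perf_counter()
    assert fn() == N * N          # both orders compute the same sum
    print(f"{fn.__name__}: {time.perf_counter() - start:.3f}s")
```

Porting the same loop pair to C and watching the gap widen is a good follow-up exercise: the difference you then see is the cache hierarchy, one of the central topics of this course.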