Sunday 22 May 2011

PROGRAMMING AND THE SOFTWARE LIFE CYCLE

Introduction
Software is an integral part of today's computer systems, and the development of computer hardware has always been accompanied by the development of software. Software development has made significant progress over the last six decades: starting from entering binary instructions directly into simple computer systems, to the creation of low-level, then intermediate-level, and finally high-level programming languages.
In addition, because of the increasing complexity of the computing tasks computers are asked to perform, software development has also spawned a series of programming concepts, ranging from simple sequential programming (just writing lines of a program that run from beginning to end), to procedural programming, to object-oriented programming. Object-oriented programming is a relatively new paradigm in which a program is organized around objects that interact with one another. Besides being easier to grasp, even for lay programmers, the object-oriented approach makes software easier to maintain, so the software becomes more flexible when it is later revised or extended.

Programming
Definition
Programming is the process of "planting" instructions into a computer so that, in operation, the computer refers to the instructions it has been given. The programming process produces a product called a "program". Different programs cause the computer to give different results for the same input. For example, suppose we have two Java programs:
a.  An Addition class, a program that adds the two inputs given to it and prints the result to the screen.
b.  A Multiplication class, a program that multiplies the two inputs given to it and prints the result to the screen.
Now suppose we feed the same input pair {1, 5} to each program: the Addition program prints 6, while the Multiplication program prints 5.
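The two programs described above might be sketched in Java as follows. The class and method names here are illustrative, not taken from the original text:

```java
// Sketch of the two example programs: same input, different results.

// Adds two inputs and prints the result to the screen.
class Addition {
    static int run(int a, int b) {
        int sum = a + b;
        System.out.println(sum);
        return sum;
    }
}

// Multiplies two inputs and prints the result to the screen.
class Multiplication {
    static int run(int a, int b) {
        int product = a * b;
        System.out.println(product);
        return product;
    }
}

public class Main {
    public static void main(String[] args) {
        // The same input pair {1, 5} is given to both programs:
        Addition.run(1, 5);        // prints 6
        Multiplication.run(1, 5);  // prints 5
    }
}
```

Running both with the input pair {1, 5} shows the point of the example: the computer's output depends entirely on the program it has been given.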
1.2.2 Programming Languages
Instructions given to a computer are ultimately binary code: sequences of binary digits (consisting of the numbers "0" and "1") that the computer can "understand". The binary instructions a computer "understands" correspond to the elementary operations the computer can perform, for example "store 1 byte at a specific address in memory", "spin the hard disk", and so forth. Each computer architecture has its own particular binary operation codes.
In the early development of computer systems, programmers programmed computers by entering binary values directly into computer memory. The binary values entered encoded the algorithm of the operations the computer was to perform. This method had several difficulties, among others:
a.  Before entering the correct binary code, the programmer had to check the mapping between binary codes and the desired computer operations.
b.  When an error occurred, the programmer had to do extra work: finding the wrong code and re-mapping the relationship between binary codes and computer operations.
As computer architectures developed and the number of operations a computer could perform grew, the method of planting instructions as binary code came to be seen as impractical, because it was difficult to translate into human language. Therefore, tools were made to convert instructions that humans can understand into instructions the machine/computer understands. For example:
a.  An instruction understandable by humans: "b = 1 + 4;"
b.  The instruction understood by the computer: "...1001101010101011100..."
From here came the term "software", which is essentially a device that regulates the execution of operations on the computer. Software made to date is written using a "programming language". A programming language can be defined as a set of instruction patterns that humans can understand, used to write programs/applications/software that produce specific operations on a computer.
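In Java, this translation step is performed by the compiler: the human-readable source below is turned into binary bytecode that the programmer never writes by hand. The class name and comments are illustrative:

```java
// Illustrative: a human-readable instruction that a translator
// (here, the Java compiler "javac") turns into binary code.
public class Translate {
    public static void main(String[] args) {
        int b = 1 + 4;          // the human-readable form of the instruction
        System.out.println(b);  // prints 5
        // Compiling this file ("javac Translate.java") produces
        // Translate.class, whose contents are binary bytecode, not text.
        // "javap -c Translate" shows a mnemonic view of that bytecode.
    }
}
```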
1.2.3 Programming Language Levels
The phrase "language that can be understood by humans" is actually relative. For binary-code programmers, a stretch of code like "1001100101011....." may well be understandable. For programmers familiar with mnemonics, a line of code such as "MOV A, #25" can still be understood. And for Java programmers, a line of code such as "String s = new String("Hello World");" is more understandable than binary or mnemonics. This is where the ranking (leveling) of programming languages comes in. A lower-level programming language is one that "tends to be understood by computers", whereas a higher-level language is one that "tends to be more understood by humans". There are several levels of programming language, among others:
a.  Low-level languages, e.g. machine language (binary) and assembler.
b.  Intermediate-level languages, e.g. C/C++ and Fortran.
c.  High-level languages, e.g. Pascal.
d.  Higher-level languages, e.g. Java and the DotNet languages.

The characteristic of lower-level languages is that the user's ability to manipulate operations at the hardware level (for example filling, editing, and deleting data in memory and registers) is greater. The higher the language level, the less the user can manipulate operations at the hardware level. In a higher-level language such as Java, the user really cannot manipulate the hardware directly, because operations on the hardware (such as allocating memory and erasing data from memory) are done automatically by the Java Virtual Machine (JVM).
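The point about Java hiding the hardware can be seen in ordinary Java code: memory is requested with `new` and reclaimed by the JVM's garbage collector, with no explicit "free" command available to the programmer. The class below is an illustrative sketch:

```java
// Sketch: in Java the programmer never allocates or frees memory
// at the hardware level; the JVM handles both sides automatically.
public class MemoryDemo {
    public static void main(String[] args) {
        for (int i = 0; i < 1000; i++) {
            // "new" asks the JVM for memory; there is no address
            // arithmetic and no explicit deallocation call.
            String s = new String("Hello World");
        }
        // Objects that become unreachable are reclaimed automatically
        // by the garbage collector, at a time the JVM chooses.
        System.out.println("done");  // prints "done"
    }
}
```

Contrast this with a lower-level language such as C, where the programmer must pair every allocation with an explicit deallocation.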
