
4.7 SYSTEM ARCHITECTURE

Exercise 2 feedback a)

b) Whilst the code appears to be informative, indicating the course and year, the code would become misleading as the student moved through the years, and if it were changed it would cease to act as a permanent unique identifier. A similar problem would arise if the student changed course.

c) An exception report – the report would show the details for only a single student.

A number of factors can influence the architecture chosen for a system, including the existing corporate infrastructure, scalability to allow for growth in data volumes and numbers of users, any legacy (old existing) system requirements and any web integration needs. System security requirements may also affect the infrastructure design. Initial costs and total cost of ownership will also need to be weighed against the technical needs.

Traditionally, early commercial computing systems (1960s–1970s) were supported by mainframe (or mini) computers, usually based in and maintained by an organisation’s central IT/computing department. These computers hosted all the organisation’s systems and data, and users interacted with them via dumb terminals (with no real processing capability) that were connected directly to the mainframe. Mainframes are still used by some organisations which need to process large volumes of data at one location, e.g. banks. Mainframes were initially used with batch processing systems, which involved entering large volumes of data and then processing the batches, often overnight, to update the main system files.

Whilst this approach to data processing is suitable for less time-critical applications, it has been superseded in many cases by online transaction processing systems which process the entered data immediately in order to ensure that all data files are kept current.
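The contrast between the two processing styles above can be sketched in a few lines of Python. This is only an illustrative sketch: the account "master file" (a dictionary) and the update functions are assumptions introduced for the example, not something from the text.

```python
# A minimal sketch contrasting batch and online transaction processing.
# The account data and function names are illustrative assumptions.

accounts = {"A001": 100, "A002": 250}  # master "file": account -> balance

def batch_update(master, transactions):
    """Batch processing: transactions are collected during the day and
    applied in one run (traditionally overnight). Until the run completes,
    the master data is out of date."""
    for acct, amount in transactions:
        master[acct] = master.get(acct, 0) + amount

def online_update(master, acct, amount):
    """Online transaction processing: each transaction is applied
    immediately, so the master data is always current."""
    master[acct] = master.get(acct, 0) + amount

# Batch style: queue transactions, then apply the whole batch at once.
queued = [("A001", -30), ("A002", 75), ("A001", 10)]
batch_update(accounts, queued)

# Online style: each transaction updates the master data as it arrives.
online_update(accounts, "A002", -25)

print(accounts)  # {'A001': 80, 'A002': 300}
```

The trade-off mirrors the text: the batch version is simple and efficient for large volumes but leaves balances stale between runs, while the online version keeps every balance current at the cost of processing each transaction as it arrives.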

The arrival of business personal computers (PCs) in the 1980s allowed users to host and run their own software and keep their own data, e.g. word processors and spreadsheet applications. Eventually these stand-alone PCs were connected to networks which allowed them to exchange data. Although stand-alone PCs provided some benefits, they created issues relating to security and data consistency, as different users may have used different versions of the same database.

Eventually local area networks (LANs) appeared. These allow stand-alone PCs (referred to as clients) to be connected to computers acting as servers which hold programs and data. These LANs also allow printers, document scanners and other specialist devices to be connected for sharing. Wide area networks (WANs) also exist, which allow the connection of clients or LANs over very large distances, e.g. in different countries. Systems using networks are often referred to as distributed systems and in most cases when a user accesses data via a network connection they are unaware of its underlying architecture.

Download free eBooks at bookboon.com

There are a number of network configurations, although they are all regarded as client-server designs, and expert network architects are often used to specify and maintain complex networks. In a client-server configuration, the client sends a request for data over the network to the server. The server then processes the request and extracts the required data, which is then sent back via the network to the client. Clients are often referred to as fat or thin. A fat (or thick) client is one which handles most of the application processing and is often a PC or laptop computer. A thin client does very little processing and instead relies on the server carrying out the processing, which usually results in better performance and a lower hardware cost. A thin client may be a simple dumb terminal consisting of a screen, keyboard and mouse with no or limited processing capability.
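The request–reply exchange described above can be sketched with a TCP socket on the local machine. This is a minimal sketch under stated assumptions: the "database" dictionary held by the server and the request format (a plain key string) are inventions for the example, not part of the text.

```python
# A minimal client-server exchange: the client sends a request over the
# network, the server extracts the data and sends it back.
import socket
import threading

DATA = {"S1001": "Alice", "S1002": "Bob"}  # data held on the server

def server(sock):
    conn, _ = sock.accept()
    with conn:
        request = conn.recv(1024).decode()      # client's request for data
        reply = DATA.get(request, "NOT FOUND")  # server extracts the data
        conn.sendall(reply.encode())            # result goes back to client

listener = socket.socket()
listener.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
listener.listen(1)
port = listener.getsockname()[1]

t = threading.Thread(target=server, args=(listener,))
t.start()

# The client's only job here is to send the request and show the reply.
client = socket.socket()
client.connect(("127.0.0.1", port))
client.sendall(b"S1001")
response = client.recv(1024).decode()
print(response)  # Alice
client.close()
t.join()
listener.close()
```

As the text notes, the user at the client end sees only the reply; nothing about the exchange reveals where the data is stored or where the processing happens.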

Client-server designs are generally regarded as two-tier or three-tier. The clients in both designs handle the user interface. In a two-tier design the application processing is shared between the server and client, with all the data being stored on the server. In a three-tier configuration an application server sits between the client and the data server. This middle layer application server handles the client requests and breaks them down into data access commands that are handled by the data server. Some more complicated designs have multiple layers; these are referred to as n-tier. In multi-tier networks specialist middleware software allows for data transfer between the different levels, including enterprise applications and web services.
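The three-tier arrangement above can be sketched as three small classes, one per tier. The class and method names, and the student records, are illustrative assumptions for this sketch only; in a real system each tier would run on separate hardware and communicate over the network.

```python
# A minimal three-tier sketch: client (UI) -> application server -> data server.

class DataServer:
    """Tier 3: holds the data and answers low-level data access commands."""
    def __init__(self):
        self._rows = {"S1001": {"name": "Alice", "course": "Computing"},
                      "S1002": {"name": "Bob", "course": "Business"}}

    def select(self, key):
        return self._rows.get(key)

class ApplicationServer:
    """Tier 2: the middle layer that takes client requests and breaks them
    down into data access commands for the data server."""
    def __init__(self, data_server):
        self.data_server = data_server

    def student_summary(self, student_id):
        row = self.data_server.select(student_id)  # data access command
        if row is None:
            return f"No record for {student_id}"
        return f"{student_id}: {row['name']} ({row['course']})"

class Client:
    """Tier 1: handles only the user interface."""
    def __init__(self, app_server):
        self.app_server = app_server

    def show(self, student_id):
        return self.app_server.student_summary(student_id)

client = Client(ApplicationServer(DataServer()))
print(client.show("S1001"))  # S1001: Alice (Computing)
```

In an n-tier design, the middleware the text mentions would sit between these layers, carrying the calls between tiers instead of the direct method calls used here.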



The choice of network configuration, i.e. data storage location and where the processing takes place, can have a significant impact on system performance and is often a balance between centralisation and decentralisation.