
Tuesday 6 August 2024

MANAGEMENT INFORMATION SYSTEM

Management levels
In an organization, there are three different levels of management, each of which requires different types of information systems at each level. They are:
Top-level managers: these are the senior executives of an organization and are responsible for its overall management. Often referred to as strategic managers, they focus on long-term issues and emphasize the survival, growth and overall effectiveness of the organization. They make strategic decisions such as mergers and acquisitions, new product planning and capital investment.
Middle-level managers: they are called tactical managers and are responsible for translating the general goals and plans developed by strategic managers into more specific objectives and activities. The role of the middle manager is to be an administrative controller who bridges the gap between the higher and lower levels. They make tactical decisions, which include pricing, capacity building, budget preparation and purchasing contracts.
Frontline/operative managers: they are called operational managers or low-level managers, and they supervise the operations of the organization. These managers often have titles such as supervisor or sales manager. They are directly involved with implementing the specific plans developed by middle managers. Some operational decisions include production scheduling, sales, ordering and credit approval. The relationships among the three levels are presented below:

Decision types in an organization
An organization has a wide variety of decisions to make, ranging from highly structured decisions to unstructured decisions.
Structured decisions: a structured decision is one that is made quite often, and one in which the decision follows directly from the inputs. With structured decisions, once you know the necessary information, you know the decision that needs to be made. For example, setting inventory reorder levels can be a structured decision. These decisions are routine, repetitive and well-defined, with clear procedures and rules. The information required for making them is usually readily available and can often be processed using standard methods or automated systems. Examples: reordering inventory when it reaches a certain level, payroll processing and financial reporting.
Unstructured decisions: these decisions involve a lot of unknowns. They are generally based on criteria that are not well-defined, and information is more likely to be ambiguous or incomplete. The decision-maker may need to exercise some thoughtful judgment and creative thinking to reach a good solution. 
Moreover, these decisions are novel, non-routine and complex, with no clear procedure for making them. Examples: setting a company's strategic direction, designing a new product and crisis management decisions.
Semi-structured decisions: these are decisions in which most of the factors needed for making the decision are known, but human experience and other outside factors may still affect the decision. A good example is the hiring process. These decisions fall between structured and unstructured decisions: they contain elements that are clear and routine as well as elements that require human judgment and analysis. In the hiring process, part of the decision is structured, like years of experience and education level, and part is based on human judgment, like social skills and problem-solving skills.
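The structured case above can be sketched in code. This is a minimal, illustrative example (the threshold and quantity are invented for the illustration): once the inputs are known, the reorder decision follows mechanically from a rule, with no human judgment involved.

```python
# Hypothetical structured decision rule: reorder inventory when stock
# falls to or below a fixed reorder point. All figures are assumed.

REORDER_POINT = 50   # assumed threshold for this illustration
REORDER_QTY = 200    # assumed fixed order quantity

def reorder_decision(stock_level: int) -> int:
    """Return the quantity to order; 0 means no order is placed."""
    if stock_level <= REORDER_POINT:
        return REORDER_QTY
    return 0
```

An unstructured decision, by contrast, could not be reduced to such a rule, which is exactly why it needs human judgment.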
Types of Information System (IS)
Transaction Processing System (TPS)
This is a computerized system that enables operational-level managers to carry out their day-to-day activities. These activities include records of daily transactions: sales, order entry, bill processing, shopping details, hotel reservations and inventory of goods and services rendered. For example:
In the sales/marketing department, there is daily processing of sales/orders
In the accounts department, there is daily processing of account details such as accounts receivable/payable
In the manufacturing department, there is a daily schedule of activities such as inventory control, production scheduling and details of manufactured goods
In the human resources department, staff movement is processed, vacancies are advertised, applications are treated and interviews are scheduled.
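The day-to-day record-keeping a TPS performs can be sketched as follows. This is a hedged, minimal illustration; the record fields and amounts are invented, not a real TPS design.

```python
# Illustrative TPS sketch: each daily transaction (sale, order, bill)
# is captured as a simple record in a running ledger.

from datetime import date

def record_transaction(ledger: list, kind: str, amount: float) -> dict:
    """Append one daily transaction to the ledger and return it."""
    entry = {"date": date.today().isoformat(), "kind": kind, "amount": amount}
    ledger.append(entry)
    return entry

ledger = []
record_transaction(ledger, "sale", 120.00)    # sales department
record_transaction(ledger, "order", 75.50)    # order entry
```

These raw records are exactly the input that the MIS, described next, summarizes into periodic reports.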
Management Information System (MIS)
This is an information system that generates reports and information to aid the management level, particularly the strategic managers and, to a lesser extent, the tactical managers, in making efficient and effective decisions. The system takes input from the TPS and generates reports that serve to plan, control, organize and direct the activities of the firm. The system generates reports on a weekly, monthly or yearly basis, but not daily as in the TPS.
Some features of the MIS include:
It has little analytical capability
It aids decision-making using past and present data
It supports structured and semi-structured decisions at operational and management levels
It has internal rather than external orientation
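The relationship between the TPS and the MIS described above can be sketched in a few lines. This is an illustrative aggregation only; the record layout is an assumption, not a standard format.

```python
# Illustrative MIS sketch: detailed TPS transaction records come in,
# and a summarized periodic report per transaction kind comes out.

def weekly_report(transactions: list) -> dict:
    """Aggregate raw TPS records into totals per transaction kind."""
    report = {}
    for t in transactions:
        report[t["kind"]] = report.get(t["kind"], 0.0) + t["amount"]
    return report

tps_records = [
    {"kind": "sale", "amount": 100.0},
    {"kind": "sale", "amount": 50.0},
    {"kind": "order", "amount": 20.0},
]
```

Note how the report discards transaction-level detail: managers see totals, not individual entries, which is the point of an MIS report.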
Decision Support System (DSS)
This is an information system that serves purely the tactical level of management. It is particularly suited to unique or rapidly changing semi-structured decisions that cannot be specified in advance, as against the MIS. Unlike the MIS, a DSS has the analytical capability to sieve through massive amounts of data to generate information for decision makers. The system is generally user-friendly, highly interactive and ideal for constructing 'what-if' scenarios with a view to making the best decision at each point in time.

DSS supports complex decision-making and problem-solving. The system analyzes large volumes of data to generate insights using sophisticated analytical models and data analysis tools.
Features of DSS
It offers flexibility, adaptability and quick response
It operates with little or no assistance from IT professionals
It provides support for decisions and problems whose solutions cannot be specified in advance
It uses sophisticated analytical and modeling tools.
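The 'what-if' analysis mentioned above can be sketched very simply: vary an assumption and compare the projected outcomes. The model, prices and demand figure below are all invented for the illustration, not a real DSS model.

```python
# Illustrative 'what-if' scenario sketch: project profit under
# several hypothetical price assumptions and compare the results.

def projected_profit(price: float, unit_cost: float, demand: int) -> float:
    """Profit under one scenario's assumptions."""
    return (price - unit_cost) * demand

# What if we priced at 5.0, 6.0 or 7.0? (assumed cost and demand)
scenarios = {p: projected_profit(p, unit_cost=4.0, demand=1000)
             for p in (5.0, 6.0, 7.0)}
```

A real DSS would use far richer models (e.g. demand falling as price rises), but the interactive pattern, change an input and re-examine the output, is the same.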
Executive Support System (ESS)
This is an information system that serves strategic managers exclusively. It has less analytical capability than the DSS but can address a wide array of problems. The ESS can be considered a more interactive MIS, combining DSS and artificial intelligence capabilities to assist managers in identifying and addressing problems and opportunities.
ESS helps senior management to take strategic decisions. It gathers, analyzes and summarizes the internal and external information used in business processes. Examples of information an ESS handles include the actions of various competitors and economic developments that support strategic planning.





Expert Systems (ES)
It is a system that provides expert advice and acts as an expert consultant to users. Expert systems have specialized capabilities that enhance their analytical power and decision-making support.
In other words, expert systems are computer programs that mimic the decision-making abilities of a human expert in a particular domain or field. They are designed to solve complex problems and provide expert-level advice and guidance.
Examples of expert systems are:
Credit application advisor
Process monitor
Diagnostic maintenance system
Investment analysis
Inventory management
Employee recruitment
Financial forecasting
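A rule-based sketch in the spirit of the first example above, a credit application advisor, is shown below. The IF-THEN rules, thresholds and field names are all hypothetical; a real expert system would encode many more rules elicited from a human expert.

```python
# Hypothetical rule-based expert system sketch: IF-THEN rules encode
# a human credit expert's judgment. All rules and thresholds are assumed.

def credit_advice(income: float, debt: float, years_employed: int) -> str:
    """Apply simple, assumed expert rules and return a recommendation."""
    if debt > income * 0.5:
        return "decline: debt burden too high"
    if years_employed < 1:
        return "refer: insufficient employment history"
    return "approve"
```

This is the essential structure of a rule-based expert system: domain knowledge lives in the rules, which can be reviewed and extended without changing the surrounding program.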
Customer Relationship Management (CRM) Systems 
A CRM system manages a company's interactions with current and potential customers. These systems help in tracking customer interactions, sales and support. Users of this information system include the sales, marketing and customer service departments.



Business Intelligence Systems (BI)
This is an information system that analyzes an organization's raw data to provide insights for business decisions. These systems include data warehousing, data mining and reporting tools. Users of this information system include managers and analysts who require data analysis.

MANAGEMENT INFORMATION SYSTEM
Concept of Information System
An information system is an organized combination of people, hardware, software, data resources and communication networks. Information systems are also a scientific field of study that covers strategic, management and operational activities. These activities basically involve the following:
Gathering of information
Processing of information
Storage of information
In other words, information system is a collection of hardware, software, data, people and processes that work together to transform raw data into useful information for the benefit of an organization.

Figure: components of IS
Components of Information System
Computer hardware: 
This is the physical equipment used for input, output and processing. The hardware structure depends upon the type and size of the organization. It consists of input and output devices, the processor and storage media devices. It also includes computer peripheral devices.
Computer software: 
These are the programs used to control and coordinate the hardware components and to analyze and process data. They consist of sets of instructions used for processing information. Software is further classified into three types: system software, application software and procedures.
Databases
Data are the raw, unorganized facts and figures that are later processed to generate information. Software is used for organizing and serving data to the user and for managing the physical storage media and virtual resources. Just as hardware cannot work without software, software needs data for processing. Data are managed using a database management system (DBMS). Database software provides efficient access to the required data and manages knowledge bases.
Network: 
Network resources refer to telecommunication networks like the intranet, extranet and the internet. These resources facilitate the flow of information in the organization. Telecommunication networks consist of computers, communications processors and other devices interconnected by communications media and controlled by software.
Human resources:
This is the manpower required to run and manage the system. People are the end users of the information system; end users use the information produced for their own purposes, and the main purpose of the information system is to benefit the end user. End users can be accountants, engineers, salespersons, customers, clerks or managers. People are also responsible for developing and operating information systems; they include systems analysts, computer operators, programmers and other clerical IS personnel.


Introduction to Management Information System (MIS)
A management information system is designed by an organization so that it can be used by managers in performing the following functions: directing, planning, coordinating, communicating and decision making.
MIS can also be said to be a structured system used within an organization to collect, process, store and disseminate information necessary for carrying out management functions.
Management concept
Management is the art of getting things and activities done through and with people. It is normally performed by managers in an organization.
Operations performed by managers:
Plan by making strategies and goals
Organizing tasks: managers organize the tasks that are required to complete the work and group these tasks homogeneously
Control performance: managers control performance by setting standards and ensuring that there is no deviation from them.
Information concept
Information is the data that has been processed, organized or structured in a way that provides meaning or value. It is an essential element in decision-making and communication.
One basic difference between data and information is that data is not used in decision making process, but information is used in the decision making process.
System concept
A system is a set of elements that work together to achieve a common objective. It helps in optimizing the output of an organization. Subsystems are parts of a larger system. For example, an organization is a system, and departments like the accounting department, human resources department, etc. are its subsystems or elements.


Role of MIS
Hardware administration:
Hardware administration involves managing and maintaining computer hardware and related infrastructure to ensure optimal performance and reliability; examples include monitoring and maintenance, repair of faulty hardware devices, inventory management and security management.
Software development:
This involves engaging software engineers to develop the software needed by the organization. The IT experts within the organization may not have the requisite software development skills; in that case, development is outsourced.
End user support:
Ensuring every member of staff of the organization knows how to use the systems, including new and updated software installed on the organization's computers.
Other roles of the MIS are data management, report generation, and communication and collaboration.

MIS benefits
Provides better communication
Makes better use of resources 
Improves customer services and relationships
Helps in recording and processing all business dealings and transactions
Provides easy access to information for managers and staff
Provides managers with interactive support in the decision-making process.

Challenges of MIS
Data security: protecting sensitive information from unauthorized access and breaches is one of the most challenging tasks of an MIS
System integration: ensuring seamless integration with existing systems and processes is another crucial challenge facing the use of an MIS
Cost: high initial set-up and ongoing maintenance costs can also be a barrier. Small organizations with low budgets may not be able to afford them
User training: ensuring that users are adequately trained to use the system effectively
Scalability: as organizations grow, their MIS must be able to scale accordingly. Ensuring the system can handle increased data volumes and more users without performance degradation can be challenging.
Interoperability: ensuring the MIS can effectively communicate and share data with other systems example, ERP, is essential for seamless operations.
Others are; obsolescence, user adoption, cultural barriers and data migration etc. 

INFORMATION PROCESSING
Data and Information 
Data is the record of the daily transactions of an organization, e.g. dates and amounts of goods on invoices, details of pay slips, number of hours worked, number of vehicles produced or sold etc. It can be considered a series of digits and figures that represent an idea. Data have no meaning on their own, so they are the raw materials for data processing.
Information is analyzed or processed data in a meaningful form for decision making. Data and information are related as follows:
    
Raw data: 20, 25, 15; Jane, Musa, Ibrahim; Female, Male, Male

After arranging (processing), the information becomes:

Name      Sex      Age
Jane      Female   20
Musa      Male     25
Ibrahim   Male     15
The term data and information are often used interchangeably on the account that information produced today can be fed back into the system at a later time for processing, thus, taking the place of data.
Data Processing Methods
Data processing can be defined as the methods and procedure through which data is converted to information. This is the manipulation of data, its retention and subsequent retrieval. Data processing methods can be categorized into three:
Manual data processing: this is a form of data processing that involves no machine, except perhaps a desk calculator, pencil, paper and typewriter.
Mechanical or electro-mechanical data processing: this is a form of data processing that involves the use of punched cards equipment.
Electronic data processing: this is the processing of data using electronic computers.
Data Processing Cycle
Data processing steps include: origination, input, processing, storage/output and distribution.
Origination: this is the collection or recording of original data (primary data) to a document called source document which forms the input data.
Input: this describes the data to be fed into the computer (initial data). It is prepared in a convenient or suitable form for processing. This form varies from one data processing device to another: an electromechanical device uses punched cards, while an electronic computer uses magnetic tape, disk, terminals etc.
Processing: this is a planned series of actions and operations performed on data, using various data processing devices, to yield data in a more meaningful form called information.
Storage: this is the storage of processed data (information) on secondary storage media for future purposes. The two arrows show the storage of information at one time and the information serving as input data at another time.
Output: this is the result of processed data for the operator. This result could be distributed to the end users of information or recirculated as input to the processing cycle.
Distribution: this is the distribution of the produced information to the appropriate quarters for decision making. The decision makers are the end users of information.
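The cycle above can be sketched as a chain of small functions, one per stage. The function names and sample figures are illustrative only; they are not part of any standard library or real system.

```python
# Illustrative data processing cycle: origination -> input/processing
# -> output, each stage feeding the next. Sample data are invented.

def originate() -> list:
    """Origination: raw figures captured on a source document."""
    return ["25", "15", "20"]

def process(raw: list) -> list:
    """Processing: convert and sort the input into meaningful form."""
    return sorted(int(x) for x in raw)

def output(info: list) -> str:
    """Output: a result ready for distribution to end users."""
    return ", ".join(str(x) for x in info)

result = output(process(originate()))
```

Storing `result` and later feeding it back into `process` mirrors the two arrows described for the storage stage: today's information can be tomorrow's input data.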


Data Processing Operations
On a daily basis, there are number of data processing operations carried out by every individual in the course of discharging one assignment or the other. The method of carrying out the operation depends on the method of processing employed. They include:
Recording: this involves the recording of data onto some forms or media, that is, expressing data in a form recognizable by a person or machine.
Duplicating: this is the reproduction of data onto many forms or documents. Backup copies are usually made to help users or operators recover from disaster.
Verifying: this is the process of checking the recorded data for possible errors. This is very important, as it offers a means of ensuring the correctness of information before it is used.
Classifying: the process whereby data is grouped, arranged or classified into a specific order, based on a unique characteristic of the data items. An example is to classify the students in a class according to gender, age, state of origin, academic level etc.
Sorting: this is the arrangement of data in a particular order, either alphabetically or numerically, in ascending or descending order; that is, arranging classified data serially in alphabetic order of last name, or in ascending order of registration number. It helps to locate records or data easily from a pool of others.
Merging: this is the combination of two or more sets of data into a single one in accordance with a specific rule. Reports in different files can be combined into one.
Searching: this is the method of locating information in a table or file with reference to a specified field of each record called the key. If the file is sorted according to the key, locating a particular record is very easy.
Retrieval: this is a method of finding, locating and extracting specific data from a record, that is, getting a record out of a pool of records. Retrieval is faster if the records are sorted; the record searched for, once found or located, can be retrieved.
Calculating: this deals with performing arithmetic operations on numeric data. This is one operation that receives great attention: the bottom line in any business is profit making, and a wrong calculation may change credit to debit.
Summarizing/Report writing: this is the method of reporting the result of an operation in a simple and straightforward manner, that is, presenting information in a condensed or highlighted form. Appropriate reports are needed across the different levels of management and must be produced accordingly.
Communication: this is the transmission of information and data from one place to another. This operation takes different forms: email, voice mail and other forms of communication.
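Three of the operations above, sorting, merging and searching, can be sketched with Python's built-ins. The names and data are made up for the illustration; note how searching a sorted list by key (binary search) is fast precisely because the data was sorted first, as the text observes.

```python
# Illustrative data processing operations: sorting, merging and
# searching. The class lists are invented sample data.

import bisect

class_a = sorted(["Musa", "Jane"])       # sorting: alphabetical order
class_b = sorted(["Ibrahim", "Zara"])
merged = sorted(class_a + class_b)       # merging: two sets into one

def found(names: list, key: str) -> bool:
    """Searching a sorted list by key using binary search."""
    i = bisect.bisect_left(names, key)
    return i < len(names) and names[i] == key
```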
Qualities/Values of Information 
Relevance: information must be relevant to the purpose it is intended to serve. It must be related to the application area; otherwise it is irrelevant to the subject area, and decisions made based on it can be neither trusted nor reliable.
Completeness: information should contain the least amount of detail consistent with effective decision making, that is, the inclusion of all the relevant information. Partial information is useless (noise) unless it is complete or whole.
Accuracy: information should be sufficiently accurate for the purpose of effective decision making. It must be completely error-free; incorrect information could have damaging consequences.
Clarity: information must be clear to the user for it to be used appropriately. Clarity enhances usability; unclear information is redundant information, as it can be neither comprehended nor disseminated.
Appropriateness: the quality of information required varies across the three levels of management (strategic, tactical and operational). The needed quality of information should be sent to the appropriate quarters or levels of management. While more detail is needed at the lower level, less detail is required as you move up the pyramid of decisions.
Timeliness: information should be made available when needed; a delay in data collection, processing and communication can render it worthless. It has to be the right, most current and recent information to be useful.
Frequency: information should be produced at a frequency appropriate to the level of management: operational level, daily; tactical level, monthly; and strategic level, quarterly.
Sources of Information
The sources of information are both within and outside an organization. Information is collected through:
An established collection or measurement system, e.g. measuring outputs, sales, costs, cash receipts, asset purchases, stock turnover etc.
Informal communication between managers and staff, e.g. briefings during a meeting.
Published articles, e.g. the research and development work being done by other companies.
Published edicts/legislation
Questionnaires, e.g. carrying out research to sample the opinions of prospective customers etc.


Advantages of Using Computer
Data processing is virtually the same regardless of the method of processing (manual, electro-mechanical or electronic). The importance/advantages of using the computer can be summarized as follows:
Speed: the processing speed of computers has made their use in information processing inevitable compared to the manual method. They offer higher productivity when large volumes of jobs are involved.
Accuracy: computers are generally accurate, whereas humans are prone to errors. There are three sources of errors: human errors (errors in data input), software errors (errors in programming) and hardware errors (system breakdown). All things being equal, the computer is still desirable.
Volume and capacity: human capability becomes inadequate with a large volume of jobs. The computer performs routine jobs quickly and efficiently.
Reliability: the computer cannot be distracted and can function 24/7.
Programmability: the computer is well suited to the automation of business and scientific applications through programming.
Disadvantages of Using a Computer
Intuition: under certain circumstances human intuition can be very useful and may improve the quality of decisions.
Improvisation: humans have the capability of reacting to unforeseen situations better than the computer.
Experience: despite recent developments in artificial intelligence, human beings have proven to be better at decision making.
Innovation: humans can be very creative and innovative.
Information Processing Organization
Centralized system
This is characterized by the use of large mainframe or minicomputers that cater for the information needs of an organization. In this arrangement, the processing operation is performed by a single host, usually situated in the computer center, and users have to go to the center in order to perform their processing, or at best can have access through remote terminals from various locations.
Decentralized system
With miniaturization in the size of computer systems, which brought about the development of minicomputers with fairly powerful processing capability, the decentralized system emerged.
Organizations now provide each business unit with an independent computer system. Thus, each unit of an organization has its own computer system for information processing and maintains its pertinent information independently of others. With this arrangement, isolated information is still a problem.
Distributed system
This concept is better explained as an offshoot of the decentralized system, that is, a decentralized computing system with information sharing. When the various departments described in a decentralized system are linked together, they form a distributed system, with each locality still maintaining its pertinent database. They can still enjoy inter-departmental communication and cooperation, thus enhancing data exchange. A distributed system may be formed by:
Expanding an existing centralized system by connecting intelligent remote terminals to enhance distributed processing capabilities.
Interconnecting independent decentralized computers.
Information Processing Techniques 
Batch processing
Batch processing is a technique whereby many individual jobs or tasks (a batch of jobs) are executed with a single program loading. With this system, a single program is used to process many jobs sequentially without having to reload the program for each job after the initial loading. Batch processing is a method whereby a system processes a group of tasks or data all at once, rather than one at a time. It is commonly used for large-scale, repetitive tasks that don't require immediate results.
An example is a payroll program that, once it is in operation, processes the different employees' pay-slips individually in a single continuous operation. There are variations of batch processing. One is that a job may be too complex to be executed in a single pass and as such may be run in phases, with the result of one phase serving as input to the next. In this case, the number of program loadings will depend solely on the number of phases, but that does not change the overall technique from batch processing, since in each phase several jobs are executed.
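The payroll example can be sketched as follows. This is a minimal illustration of the batch pattern, one program run handling the whole batch sequentially; the employee records and pay formula are invented.

```python
# Illustrative batch processing sketch: one program loading processes
# every employee's pay-slip in a single continuous, sequential run.

def run_payroll_batch(employees: list) -> list:
    """Produce a pay-slip line for each employee in the batch."""
    slips = []
    for e in employees:                  # sequential, no reloading
        pay = e["hours"] * e["rate"]
        slips.append(f"{e['name']}: {pay:.2f}")
    return slips

batch = [
    {"name": "Jane", "hours": 40, "rate": 10.0},
    {"name": "Musa", "hours": 35, "rate": 12.0},
]
```

Nothing is returned to any user until the whole batch has been processed, which is what distinguishes this from the on-line processing described next.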
On-line Processing
On-line processing involves the connection of a computer with user-oriented terminals. It is a real-time interaction between users and the computer system to input, process and receive immediate feedback on data. Terminals are under the control of the CPU and hence are capable of interacting with the computer. The term 'on-line' has become a household term used to denote a direct connection to the CPU. On-line equipment operates at the same time as, and in cooperation with, the computer to accomplish a task.
Examples of on-line system are: 
A mouse attached to the computer for drawing various objects.
An attached plotter or printer connected to the computer to produce graphical or textual output respectively.
A VDU with keyboard for interactive data entry into the computer.
Teleprocessing
Teleprocessing is a term generally used to refer to various computer services that use input and output terminals at locations remote from the computer. It refers to the use of telecommunications equipment by a computer.
Teleprocessing has made it possible for people in different locations or parts of the world to attend a conference without leaving their respective homes. That is, the technology allows their terminals to be linked to the conference center where the seminar or conference is taking place, using a special code.
Teleprocessing involves the following:
Conversational time sharing: where many users at remote locations, through their terminals, are put under the illusion that they have the entire computer to themselves.
Enquiry servicing: where users can make enquiries from remote locations about airline seat reservations and stock market quotations, among other things.
Data acquisition: where users are able to gather information about the current state of an ongoing process in a timely fashion.
Message switching: where users are under the guise of having a personal switchboard. This leads to lower costs through line concentration and communication by store-and-forward routing of messages.
Real- time Processing
Real-time describes the processing situation where the computer is able to respond to urgent signals realistically within human reaction time; to simulate in real time means to simulate at exactly the real-world time rate. A real-time system is used to control an ongoing process, and its output is made available urgently to effect the necessary changes for effective control. Examples are airline seat reservation, chemical process control and military warfare.
Time-sharing System
Time-sharing, as the name implies, is the sharing of computer time among many users. The term is associated with a computer in multi-access mode, or the simultaneous utilization of the computer from multiple terminals.
Let us consider a centralized system having N users. The computer time is shared among the various users connected to it, giving each of them the impression that the computer is dedicated to his job, but in actual fact the computer services each of them in rotation. This is made possible by the fact that the computer spends a fraction of a second at each workstation, does as much work as it can in the apportioned time, shifts to the next station, and so on until everyone has been serviced. This system has two basic functions:
To handle the communication needs of the various users, and
To execute the users' programs
The system is equipped with a clock, which governs the duration of stay at each location. It is also worth noting that in a time-sharing system the number of terminals (users) that can be serviced is limited by the number the communication controller can handle and by the central processing speed.
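The rotation described above can be sketched as a round-robin simulation. This is an illustrative model only: the time slice plays the role of the clock-governed stay, and the job lengths are invented.

```python
# Illustrative time-sharing sketch: the processor visits each user's
# job in rotation, spending one fixed time slice per visit, until all
# jobs are finished. Job lengths (in ms of work) are invented.

def time_share(jobs: dict, slice_ms: int) -> list:
    """Return the order in which users are serviced until all jobs finish."""
    remaining = dict(jobs)               # ms of work left per user
    order = []
    while remaining:
        for user in list(remaining):     # visit each station in rotation
            order.append(user)
            remaining[user] -= slice_ms  # clock-governed stay expires
            if remaining[user] <= 0:
                del remaining[user]
    return order
```

With a small enough slice, each user is revisited so quickly that the computer appears dedicated to them, which is exactly the illusion the text describes.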


Multiprogramming System
Multiprogramming is a computer technique that refers to the concurrent processing of more than one job. In a system having only one processor, multiprogramming is made possible by interleaving the operation of more than one job resident in memory whose processor time requirements are diverse. That is, when the processor is busy performing computation on one job, another job may be doing input/output (I/O) operations, thus enhancing system throughput and reducing processor idle time.
Multiprogramming enhances:
Sharing of the processor time
Sharing of the main store (2 or more jobs reside there)
Potential sharing of other resources.
Therefore, multiprogramming attempts to maximize the efficiency of the computer by keeping busy all the components of the computer.
Multiprocessing System
A multiprocessing system is a computer system that has more than one processor, thus enabling it to process many jobs simultaneously. A multi-processing system houses several jobs in the main memory of the computer just like in multiprogramming but differs from multiprogramming because it processes jobs simultaneously rather than overlapping (interleaving) I/O and computing operation. Therefore, multiprocessing fosters throughput, reliability, parallelism and economy of scale.


Multitasking System 
Multitasking stands for multiprogramming on single-user systems (PCs): it affords one person the opportunity of running more than one program concurrently on a single PC. MS Windows is a multitasking operating system, which allows more than one program to be run concurrently. That is, while typing a document in Word, Access or Excel could be used to search some client records simultaneously.

FILE ORGANIZATION, DATABASES AND DATA WAREHOUSE.
Bit: a contraction of "binary digit", the 0's and 1's which form the computer's language.
Character: a collection of related bits. It is also said to be a basic unit of information that represents a letter, number, symbol, or control code. It can be a single alphanumeric character like “A” or a special character like “$” or “&”.
Field: a combination of related characters that represents a single attribute, e.g. Name or Gender.
Record: a collection of related fields describing one entity, e.g. one employee.
File: a collection of related records with the same fields.
Database: this is an integrated collection of related files about a firm, serving as a pool of information for many users. The person who creates and maintains the database is called the Database Administrator (DBA). As an example, consider a file of employee records containing: Name, Gender, State, and Department.
S/N | NAME      | GENDER | STATE | DEPARTMENT
----|-----------|--------|-------|---------------
1   | Ayodele   | M      | Kogi  | Registry
2   | Funmilayo | F      | Ogun  | Computer
3   | Ahmad     | M      | Kwara | Administration
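The hierarchy above can be illustrated with a small Python sketch (illustrative names, not part of the notes): each dict is a record, its keys are fields, and the list plays the role of the file.

```python
# A hypothetical employee "file" as a list of records; each record is a
# dict whose keys are the fields Name, Gender, State and Department.
employees = [
    {"name": "Ayodele",   "gender": "M", "state": "Kogi",  "department": "Registry"},
    {"name": "Funmilayo", "gender": "F", "state": "Ogun",  "department": "Computer"},
    {"name": "Ahmad",     "gender": "M", "state": "Kwara", "department": "Administration"},
]

# A database lets many users query the same pool of information,
# e.g. selecting the names of all male employees:
males = [rec["name"] for rec in employees if rec["gender"] == "M"]
print(males)  # ['Ayodele', 'Ahmad']
```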


Software Development Lifecycle (SDLC)
The SDLC refers to the set of activities that constitute the phases of development involved in producing a computerized information system (CIS); that is, the developmental phases involved in producing a computer-based system. Generally, the term 'lifecycle' covers the birth, life and death of an organism or ecosystem. A computer system is no exception: it comes into existence, has a useful life, and then dies. In essence, a system is developed (birth), put to use (life), and with time becomes obsolete and is replaced (death).
The activities of the lifecycle are constituted into phases of information system development (ISD). They include:
Scope and objectives
Feasibility study
Systems analysis
Systems design
System implementation and conversion
Post-implementation review and maintenance

Scope and objectives
This phase is carried out by a project team, which includes owners, users, designers, builders and analysts. The system owners meet with the project team to describe in clear terms the following issues:

What is desired?
What area of business to investigate (scope)?
What area of system to improve?
In this phase, details about budget, time and available resources are discussed. These details are contained in what is referred to as the “project’s terms of reference”. That is, the scope is well stated and the objectives are clearly defined as they help to define in clear terms the actual task to be done by the team. 
Feasibility study
This is a study carried out prior to a development project to ascertain that the proposed system is feasible and can serve the intended purpose. That is, following the terms of reference in the scope and objectives, an initial investigation of the system is undertaken to find out the details of what is required, its possibility given the available resources viz; time, finance, manpower and technology. The findings are contained in a report called feasibility report.
The purpose of feasibility study is to investigate the present system and evaluate the possible application of a revised or new system. After this, a system is selected tentatively, its cost and effectiveness evaluated, as well as its impact on existing personnel, before it is finally determined if a new or retrained personnel is needed for the system.
It is, however, necessary to conduct a series of tests on each of the alternative systems to establish clearly their benefits and liabilities. The analyst should determine whether the solution is feasible or achievable, given the available resources, and should establish both its technical feasibility and its economic feasibility.
System analysis
System analysis is the process of studying the network of interactions within an organization with a view to assisting in the development of new or improved methods of performing the necessary work. It involves the following steps:
Problem definition and classification
Data collection and analysis
Analysis of system’s alternatives
In problem definition and classification, the system analyst is charged by the management of an organization to assess the nature of the problem and its possible cause(s), and to proffer an effective solution. Problem definition and classification requires a considerable amount of data to be collected and analyzed to give the analyst a direction of search. Once the problem has been clearly defined, the analyst will have a clear picture and a statement of objectives for the remainder of the project.
Data collection and analysis is the next step: having clearly defined the problem, the analyst collects facts (data) regarding it. The analyst gathers the data relevant to the problem and analyzes it critically in order to find a lasting solution. Tools and techniques for data collection and analysis include system interviews, standard flowcharts, decision tables and questionnaires.
In the analysis of system alternatives, the complexity of most systems makes it practically impossible to evolve a solution that is ideal for a major problem. It is therefore pertinent to consider the merits and demerits of several alternatives before selecting any one as the best. Usually, it is the responsibility of management to select the alternative considered best among all the options.
System design
This is the development of the actual mechanics for a workable system. At this stage, the analyst focuses attention on the ways and manner in which jobs would be processed on the system. He develops specifications for the system's inputs, outputs and information base. The product of the design phase is a set of operating procedures, computer programs and hardware specifications. Program development follows: having developed the system on paper (system flowchart), the programmer takes over to develop the main logic needed to actualize the proposed system. The programmer develops the program flowchart, decides on the choice of programming language, and then carries out the actual coding and testing of the program.
Design tools for system design include:
Programming: involves translating the design specifications into program code. This includes design documentation, transmission and report layouts, and the specifications for each program.
Prototype system development: pilot or prototype system development involves developing a model for the proposed system, having all its characteristics. In short, it is an abstraction of the real system which can be subjected to series of tests in order to predict the behavior of the proposed system in real life.


System implementation and conversion
Putting a new system into operation is usually a very complicated process because the old system is still in use. In some cases, implementation may involve entirely new equipment, that is, changing from a manual system to an automated one, or just a change in the capacity of the system. Under system implementation, testing of the new or modified system is very important. Testing comes in these three forms:
Unit testing: it involves testing each program unit with a view to detecting errors at the elemental units of the system. If each program unit is error-free, then the whole system has a high degree of probability of being free of errors.
System testing: this involves testing the functionality of the entire system. It is one thing for the individual units that make up the system to function well; it is another for the whole system, after integration, to function together as a unit.
Acceptance: this is a final certification that the system is ready for installation and implementation. The test is performed by users and technical staff, and reviewed by management.
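As an illustration of unit testing, the Python sketch below (hypothetical function and test names, not from the notes) exercises one program unit, a reorder-decision function, in isolation, before any system or acceptance testing takes place:

```python
import unittest

def needs_reorder(stock_level, reorder_point):
    """Program unit under test: flags when inventory must be reordered."""
    return stock_level <= reorder_point

class TestReorderUnit(unittest.TestCase):
    # Unit testing: each elemental unit is exercised on its own,
    # so errors are caught before system and acceptance testing.
    def test_below_reorder_point(self):
        self.assertTrue(needs_reorder(stock_level=5, reorder_point=10))

    def test_above_reorder_point(self):
        self.assertFalse(needs_reorder(stock_level=50, reorder_point=10))

if __name__ == "__main__":
    # exit=False lets the script continue after the test run.
    unittest.main(exit=False)
```

If every such unit passes its own tests, the integrated system has a much higher probability of being error-free, which is exactly the rationale given above.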
Conversion: the method of changing from one system to another, particularly a new one, is known as conversion. There are several commonly used methods of changing from the old system to the new. These methods are:
Parallel method
Dual method 
Inventory method
Pilot method
Parallel method
This conversion method involves simultaneous operation of both the old and the new system until the new system is certified efficient.
Dual or phase method
This involves gradually phasing out the old system in favour of a new one; hence it is known as the "gradual change method" or "phasing method". The cost associated with this is not very high, since there is little or no duplication of work; rather, part of the job is done on the old system while the remainder is done on the new system.
Inventory method
This is called “Direct” or “Crash” method. It requires one-time conversion from old to new. This approach is carried out on non-business days (weekends). Critics of this method argued that:
It is clear, quick and reasonably inexpensive
It is risky and potentially suicidal
Pilot method
This method involves implementing a small portion of the new system either in parallel or dual method, while a major portion is processed on the old system.
Post-implementation review and maintenance
There is need for a post-installation process whereby the analyst examines the performance of the system, compares his findings with the initial expectations, and makes recommendations based on these. Usually, a task force is set up to conduct a post-implementation review to ascertain the operation and effective performance of the implemented system.

  

TOTAL QUALITY MANAGEMENT

TOTAL QUALITY MANAGEMENT, FIRST SEMESTER, 2023/2024 SESSION, 400 LEVEL.
1. GENERAL INTRODUCTION
Concern for quality has grown with humanity. For instance, during the Stone Age period, people used strong stones or sharp weapons to hunt. This shows that they knew the qualitative difference in tools for effectiveness of work. In the same vein, for businesses to satisfy and retain their customers, improving the quality of their products to the taste of their customers is vital.
A business is likely to make mistakes when it offers what it perceives as value to its customers. Therefore, a business should learn to look at customer value from the perspective of customers in order to meet or exceed customer requirements. 
One popular approach to improving product quality is called "Total Quality Management (TQM)". The approach has developed over time and has had different meanings at different periods; for instance, at one time it was called quality control. Generally, TQM involves the organisation's long-term commitment to the continuous improvement of quality throughout the organization, with the active participation of all members at all levels, to meet and exceed customer expectations. This top-management-driven philosophy is considered a way of organizational life. In a sense, TQM is efficient and effective management.
Although the specific programs may vary, they usually require a careful analysis of customer needs, an assessment of the extent to which these needs are currently met, and a plan to fill the possible gap between the current and the desired situation. To make the TQM program effective and efficient, top managers must be involved: setting a vision, reinforcing organizational values that emphasise product quality improvement, setting product quality objectives or goals, and deploying resources for the product quality program. It is also obvious that TQM demands a free flow of information: vertically, horizontally and diagonally.
Training and development is another aspect that is critical in improving product quality. It helps in developing the skills for learning how to use tools and techniques such as statistical product quality control. This continual effort at improving the quality of the organization's products requires that the organization should be a "learning organization".
All these product quality improvement efforts by business organizations result from the fact that today’s customers are becoming more powerful, complex and sophisticated.   
2. THE MEANING OF PRODUCT QUALITY AND TOTAL QUALITY MANAGEMENT
Product Quality: In general, quality is a value of things relative to their purpose. Business wide, quality is the usefulness or worth of a product to customers. Thus, product quality can be seen as the totality of features and characteristics of a product that bears on its ability to satisfy stated and implied needs of customers /consumers.
Products that are viewed as quality products by customers are the ones that meet the specifications and performance standards of the customers and consumers. Traditionally, quality was viewed solely as meeting customer specifications. However, the total quality view, or the current view of product quality, is the ability of the product to satisfy customers' needs and exceed their expectations.
Total Quality Management: The modern TQM is now referred to as "Business Excellence". As earlier defined, TQM is the organisation's long-term commitment to the continuous improvement of quality throughout the organization, with the active participation of all members at all levels, to meet and exceed customer expectations. It is a top-management-driven philosophy based on continuous product quality improvement, productivity and customer satisfaction. It is a philosophy that makes an organization fully involve all its stakeholders (departments, employees, suppliers, customers etc.) in its quality culture; and all these efforts are customer-driven.
3. HISTORY OF TQM
Total quality management is the integration of all functions and processes within an organization in order to achieve continuous improvement of the quality of products and services. Deming defined quality as a "never-ending cycle of continuous improvement". Juran defined it as fitness for use (purpose). It should be borne in mind that TQM is a long-term improvement process which requires significant resources. It is also important to realize that TQM is a dynamic, not a static, process that is based upon continuous efforts to improve quality in products. Since there are no deadlines or targets to be met, TQM can never be considered complete, which makes it a way of life.
 During the early years of manufacturing, inspection was used as a tool to decide if a worker’s job or a product met its requirements. At that time inspection was not done in a systematic way, yet it worked well when the volume of production was low. However, as organizations became larger and more complex, the need for more effective operations became obvious. 
In the early 1900s, "The Father of Scientific Management", Frederick W. Taylor, helped to satisfy this need. He proposed a framework for the effective use of people in industrial organizations through his book 'The Principles of Scientific Management'. One of his concepts was clearly defined tasks performed under standard conditions. Inspection was one of these tasks; it was intended to ensure that no faulty product left the workshop or factory. Inspection focuses on the product and on the detection of problems in it, testing every item to ensure that it matches the requirements or specifications of the customers. It is carried out at the end of the production process and requires specially trained inspectors. The need to perform this process led to the emergence of a separate inspection department, which in turn gave rise to the concept of defect prevention and, from it, quality control.
The roots of Total Quality Management can be traced back to the 1920s, when Dr W. Shewhart developed the application of statistical methods to the management of quality. He demonstrated that variation in the production process leads to variation in the product; thus, by eliminating the variation in the process, a good standard of end product can be achieved. The theory of Statistical Quality Control focuses on the product and on the detection and control of quality problems; it involves testing samples and statistically inferring the compliance of all products. This process is carried out throughout the production process and requires trained production employees as well as quality control professionals. Towards the end of the 1920s the theory was further developed by Dodge and Romig, who devised statistically based acceptance sampling as an alternative to 100% inspection.
In 1940s, the quality guru Deming with his peer co-workers Juran and Feigenbaum continued with the improvement of the theory. However, instead of focusing just on quality of products the concept rapidly widened to involve quality of all issues within an organization i.e. Total Quality Management.
During the 1950s, many Japanese products were of low quality and were viewed by the world as junk products. Industrial leaders in Japan recognized this problem and decided to produce high-quality products by inviting and seeking the help of American quality gurus such as Deming, Juran and Feigenbaum. Deming suggested that this aim could be achieved within just five years. As a matter of fact, not many Japanese believed what Deming claimed; however, they followed his suggestion in order not to lose face and because they respected him. In the end, Japan was able to achieve it.
In the late 1950s, quality control management developed rapidly and became the main theme of Japanese management. Interestingly, the idea did not stop at the management level. In the early 60s the concept of the quality control circle was first introduced in Japan by K. Ishikawa. A quality circle is a group of workers who meet to discuss issues for improving all aspects of the workplace and make presentations to management with their ideas for improvement. In this way workers were motivated, because they felt that they were involved and listened to. Another advantage was the idea of improving not only the quality of products but all aspects of organizational issues, which probably marked the start of Total Quality. The term Total Quality was first used by Feigenbaum at the first international quality control conference in Tokyo in 1969.
During the 1980s and 1990s a new phase of management and quality control began, which became known as Total Quality Management (TQM). Although there are many different definitions of TQM, the concept remains the same. Nowadays, TQM is also referred to as Business Excellence. The development and successful use of TQM in Japan has led to the widespread use of the concept worldwide.
4. A) DIMENSIONS OF PRODUCT QUALITY IN MANUFACTURING ORGANIZATIONS
i. Performance: This entails the ability of the product to fulfil the purpose for which it was purchased. It refers to doing the job it is supposed to do as specified by the customer/consumer. Performance has operating characteristics such as speed, comfort, ease of use, etc.
ii. Features: Does the product possess all of the features specified or required for its intended purpose? Product features are additional characteristics that enhance the appeal of the product to the user. Features are often a secondary aspect of performance: characteristics that supplement a product's basic functioning. An example is free drinks on a plane.
iii. Reliability: It is the likelihood that the product will perform as expected, free from malfunctioning within a given time period and in different places. It clears the consumer’s doubt on whether the product will consistently perform within specifications and over a desired period of time. The emphasis here is on consistency of the product to perform.
iv. Durability: This means the normal longevity of the product; i.e. getting longer life without much repairs, inconveniences and necessity to replace. Durability measures the length of a product’s life. The item will be used until it is no longer economical to operate it. This happens when the repair rate and the associated costs increase significantly. Technically, durability can be defined as the amount of use one gets from a product before it deteriorates.  
v. Conformance: It is the degree to which the product satisfies or conforms to pre-established standards. It answers whether the product conforms to the specifications of the customer: if it is developed based on a performance specification, does it perform as specified? Conformance, therefore, refers to the precision with which the product meets the specified standards.
vi. Serviceability: This includes the speed, ease and convenience of obtaining maintenance work or repairs, and the courtesy and competency of service employees. As end users become more focused on Total Cost of Ownership than on simple procurement costs, serviceability (as well as reliability) is becoming an increasingly important dimension of quality and a criterion for product selection.
vii. Reputation or perceived quality: Reputation refers to the past performance of a company's product. Perceived quality is the quality attributed to a good or service based on indirect measures: subjective opinions about the product based on images or attitudes formed by advertising and/or the reputation of the producer.
viii. Aesthetics: The aesthetic properties of a product refer to its appearance. It involves how the product looks, feels, sounds, tastes, or smells. It is clearly a matter of personal judgment and a reflection of individual preference. 
B) DIMENSIONS OF PRODUCT QUALITY IN SERVICE ORGANIZATIONS
 i) Reliability: Consistency of the satisfaction to be derived from the service provider.
ii) Accessibility and convenience: Ease of obtaining the service (is the service easy to obtain?)
iii) Timeliness: Will a service be performed at the promised time?
iv) Completeness: Are all items in the order included?
v) Tangibility: The extent to which their tangible facilities are good.
vi) Empathy or Courtesy: Mutual respect for customers 
vii) Responsiveness: Prompt response to customers stated and unstated needs as well as their complaints.
5. BENEFITS OF PRODUCT QUALITY 
The following are some of the benefits to be derived by profit-making organisations from adhering to product quality.
i. The image of the organization will be high: the improved reputation can make the business a major player in the industry.
ii. With consistency in product quality, market share will increase and consequently sales and profits will be higher.
iii. Due to improved quality, the re-work and delays are avoided or reduced. This reduces the manufacturing cost.
iv. Productivity will increase due to reduced re-work and less inspection time.
v. Employees’ morale at all levels will be high. This will improve work atmosphere and lead to better team work.   
vi. Adhering to product quality tends to ease the introduction of new product lines, because of the reputation earned previously by the business through its quality products.
vii. Customer retention: Product quality can help in keeping committed customers/consumers.
6. CAUSES OF PRODUCTS QUALITY FAILURE
The following are among the issues that can cause product quality failure or can result to low quality products in an organization depending on how the organization handles them.
i. Manpower Related: Human error can happen due to fatigue; poor eyesight, hearing or movement; inadequate training and re-training; and poor knowledge of the process. Other causes include lack of supervision; frequent changes, transfers and absenteeism; and top management paying too little attention to product quality.
ii. Facilities Related: To achieve a particular standard of quality, the machinery, tools and fixtures, as well as the measuring instruments, should have the required accuracy. The machinery should be good enough to give consistency of quality across all production runs. Worn-out tools and deteriorating machines are to be reconditioned periodically or replaced. Preventive maintenance should be directed at all the basic and important machines.
iii. Process Related: If the designed and adopted processes of production were not tested before subjecting them to mass production or were not easy to operate, it may affect the quality of products produced.
iv. Raw Materials Related: This has to do with the raw materials for the production of the goods. Quality of the products can be affected if the supplied raw materials are not in a good condition. Businesses should therefore, make sure that they supply the right and correct raw materials that will support the production of quality products.   
v. Conduciveness of Production Environment Related:   
7. REASONS FOR MANAGEMENT EFFORTS FAILURES IN PRODUCT QUALITY MANAGEMENT
Both employees and organisations are aware of the importance of producing quality products for the survival and growth of their businesses. Despite this, there are still organisations that do not adhere to the notion of quality products or achieve success in quality improvement. Experts feel that this may be due to adherence to the production concept, the philosophy that whatsoever is produced is sold, which has been challenged since the era of Elton Mayo. As earlier stated, product quality should be top-management-driven. Some of the reasons for product quality culture failures are as follows:
i. Targets and financial turnover are given more emphasis by the management of some organisations. This leads to slackness in the quality of products.
ii. Most often, managers of some organisations look for short-term gains and try to push products in a hurry.
iii. Manpower selection, not well suited to products and processes.
iv. Inability of managers to provide the required facilities, which does not help in getting the desired result or achieving the organizational objectives. Quality improvement programs require some investment in facilities, training and R&D work. Some managers have the wrong notion that quality improvement costs too much and hence fail to provide funds for it, which creates problems. Such managers are to be made to understand that the investment pays in the long run in terms of customer satisfaction and hence high sales and profits.
8. COST OF QUALITY
Cost of Quality can be seen as a process that measures and determines where and how the resources of an organization are utilized in maintaining quality and preventing the delivery of poor outputs. In other words, Cost of Quality is a method for calculating the costs companies incur in ensuring that products meet quality standards, as well as the costs of producing goods that fail to meet those standards. It covers the costs that the organization bears while trying to achieve and maintain quality output.
The Cost of Quality can be represented as the sum of two factors, the Cost of Good Quality (CoGQ) and the Cost of Poor Quality (CoPQ), as shown in the basic equation below:
CoQ = CoGQ + CoPQ
The Cost of Quality equation looks simple but in reality it is more complex. The Cost of Quality includes all costs associated with the quality of a product from preventive costs intended to reduce or eliminate failures, cost of process controls to maintain quality levels and the costs related to failures both internal and external.
Cost of quality has 4 basic components categorized under two main components as below:
A) Cost of Good Quality (CoGQ)
1. Prevention costs. These are the costs incurred in preventing mistakes.
You incur a prevention cost in order to keep a quality problem from occurring. It is the least expensive type of quality cost, and so is highly recommended. Prevention costs can include quality planning, market research, proper employee training, supplier certification and robust product design. A focus on prevention tends to reduce scrap costs, because scrap occurrence is reduced. Expenditure in this area is seen as an investment.
2. Appraisal costs. These include the costs incurred in order to maintain acceptable product quality levels. Appraisal costs can include, but are not limited to, the following: costs of inspecting products, Incoming Material Inspections, Process Controls, and Supplier Assessments.
B) Cost of Poor Quality (CoPQ)
3. Internal failure costs. These are the costs incurred when a product does not conform to quality standards before it reaches the customer. Examples include the costs of scrap, repairs to defective products, and waste due to poorly designed processes.
4. External failure costs. These are the costs incurred when the customer receives a poor-quality product. Examples include the costs of investigating complaints, replacing products returned by the customer, warranty charges, and the cost of losing a dissatisfied customer who shifts his loyalty to a new supplier.
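With hypothetical cost figures (illustrative only, not from the notes), the CoQ equation can be worked through as follows: the four component categories are summed into CoGQ and CoPQ, which in turn give the total Cost of Quality.

```python
# Hypothetical annual cost figures for one product line.
prevention = 40_000        # training, quality planning, supplier certification
appraisal = 25_000         # inspections, process controls, supplier assessments
internal_failure = 15_000  # scrap, rework, process waste
external_failure = 20_000  # complaints, returns, warranty charges

cogq = prevention + appraisal               # Cost of Good Quality
copq = internal_failure + external_failure  # Cost of Poor Quality
coq = cogq + copq                           # CoQ = CoGQ + CoPQ

print(f"CoGQ = {cogq}, CoPQ = {copq}, CoQ = {coq}")
# CoGQ = 65000, CoPQ = 35000, CoQ = 100000
```

Tracking the split between CoGQ and CoPQ over time shows whether spending on prevention and appraisal is actually driving down the failure costs.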



9. PRINCIPLES OF TOTAL QUALITY MANAGEMENT (TQM) 
With increased competition and market globalization, Total Quality Management principles and practices are now becoming more and more important for the leadership and management of any organization. Below are the basic principles of TQM that are practiced worldwide. 
i. Customer Focus: Customer focus is perhaps the most important principle of TQM. This principle stresses that an organization should understand its customers in terms of what they need and when they need it, while trying to meet and exceed their expectations. Profit will then increase as a result of a growing pool of loyal customers attracted to the business by its customer-oriented activities.
 ii. Leadership: Good leaders help to unite an organization and give employees a sense of direction. They create and nurture an environment where everyone’s views are given careful consideration in charting a course to the organization’s objectives. Organizations succeed when leaders establish and maintain the internal environment in which employees can become fully involved in achieving the organization’s unified objectives. Therefore, without effective leadership, an organization loses its direction. This principle establishes that leaders are fundamental in setting clear goals and objectives and ensuring that employees are actively involved in achieving these objectives.
iii. Total Employee Involvement: This is where each and every employee's effort or contribution to the organization is required and viewed as important. It requires that all employees participate in working toward common goals. The idea is that, if employees are made to feel that their own effort or contribution is important and can contribute to the achievement of the overall organizational goals and objectives, they become innovative, creative and eager to participate in achieving those objectives. Hence, this principle helps in creating a commonality of purpose in the organization.
iv. Continual Improvement: Continuous improvement is part of the management of all systems and processes. Achieving the highest level of product quality requires a well-defined and well-executed approach to continuous improvement and learning. Continual improvement drives an organization to be both analytical and creative in finding ways of becoming more competitive and more effective at meeting stakeholders' expectations. This continuous improvement will lead to improved, higher-quality processes.
v. Process-centered: A fundamental part of TQM is a focus on process thinking. A process is a series of steps that take inputs from suppliers (internal or external) and transforms them into outputs that are delivered to customers (internal or external). The steps required to carry out the production are defined, and performance measures are continuously monitored in order to detect unexpected variation.
vi. Strategic Approach to Improvement: Businesses must adopt a strategic approach towards quality improvement to achieve their goals, mission and vision. A strategic plan is very necessary to ensure quality becomes the core aspect of all business processes.
vii. System Approach to Management: Organizations sustain success when processes are managed as one coherent quality management system. This involves identifying, understanding and managing interrelated processes so that the sub-systems of the organization contribute to its effectiveness and efficiency, which helps in achieving the objectives of the organization as the larger system. Therefore, this principle stresses that several processes are managed simultaneously in an organized system. This makes the system much more effective and greater than the sum of its individual parts.
viii. Decision-making based on facts: Decision-making within the organization should be based on facts and not on opinions (emotions and personal interests). Data should support this decision-making process.
10. TOTAL QUALITY MANAGEMENT GURUS AND THEIR CONTRIBUTIONS
To fully understand and appreciate the TQM movement, we need to look at the philosophies of notable individuals who have shaped the evolution and development of TQM. Their philosophies and teachings have contributed to our knowledge and understanding of TQM today. Below are the notable ones:
1. Walter Shewhart: He was an American professor of statistics. He contributed to the understanding of “process variability” by developing the “Statistical Quality Control Chart”. Shewhart studied randomness and recognized that variability exists in all manufacturing processes. He developed quality control charts that are used to identify whether the variability in a process is random or due to an assignable cause, such as poor workers or ineffective machinery. He stressed that eliminating variability improves quality. Professor Shewhart was the first person to advocate the use of statistical quality control methods in quality evaluation. His work created the foundation for today’s process control, and he is often referred to as the “grandfather” of quality control.
2. W. Edwards Deming: Deming is often referred to as the “father of quality control”. He was a statistics professor at New York University in the 1940s. After World War II, he assisted many Japanese companies in improving quality. The Japanese regarded him so highly that in 1951 they established the Deming Prize, an annual award given to firms that demonstrate outstanding product quality. It was about 30 years later that American businesses began to adopt Deming’s philosophy. A number of elements of Deming’s philosophy depart from traditional notions of quality. The first is the role management should play in a company’s quality improvement effort. Traditionally, poor quality was blamed on workers: on their lack of productivity, laziness or carelessness. However, Deming pointed out that only 15 percent of quality problems are actually due to worker error. The remaining 85 percent are caused by processes and systems, including poor management. Deming said that it is up to management to correct system problems and create an environment that promotes quality and enables workers to achieve their full potential. He believed that managers should drive out any fear employees have about identifying quality problems. Proper methods should be taught, and detecting and eliminating poor quality should be everyone’s responsibility.
Deming outlined his philosophy on quality in his famous “14 points”. These points are principles that help and guide companies in achieving quality improvement. The principles are founded on the idea that top level management must develop a commitment to quality and provide a system to support this commitment that involves all employees and suppliers. Deming stressed that quality improvements cannot be achieved without organizational change that comes from the top level management.
3. Joseph M. Juran: After W. Edwards Deming, Dr. Joseph Juran is considered to have had the greatest impact on quality management. Juran initially worked in the quality programme at Western Electric. He became better known in 1953, after the publication of his book, the “Quality Control Handbook”. In 1954 he went to Japan to work with manufacturers and teach classes on quality. One of Juran’s significant contributions is his focus on the definition of quality and the cost of quality. Juran is credited with defining quality as “fitness for use” rather than simply conformance to specifications. According to Juran, fitness for use takes into account customer intentions for the use of the product instead of focusing only on technical specifications. Juran is also credited with developing the concept of the cost of quality, which allows quality to be measured in monetary terms rather than on the basis of subjective evaluations. The following table summarizes the main contributions of these three contributors:
Quality Guru and Main Contributions

Walter Shewhart:
i. Contributed to the understanding of process variability.
ii. Developed the concept of statistical control charts.

W. Edwards Deming:
i. Stressed management’s responsibility for quality.
ii. Developed the “14 points” to guide companies in quality improvement.

Joseph M. Juran:
i. Defined quality as “fitness for use”.
ii. Developed the concept of cost of quality.

Other contributors include: Armand V. Feigenbaum, Philip B. Crosby, Kaoru Ishikawa and Genichi Taguchi.
11. STATISTICAL QUALITY CONTROL (SQC)
Statistical Quality Control refers to the use of statistical methods in the monitoring and maintaining of the quality of products. Its main objective is to achieve quality in production through the use of adequate statistical techniques.
SQC has long roots: as soon as manufacturing began, competition accompanied it, with consumers comparing and choosing the most attractive products. The Industrial Revolution created the need for producers to develop methods for the control of their manufactured products. At that time SQC was comparatively new; its greatest developments took place during the 20th century. In 1924, at the Bell Laboratories, Shewhart developed the concept of the control chart and, more generally, statistical process control (SPC), shifting attention from the product to the production process. Dodge and Romig (1959), also at the Bell Laboratories, developed sampling inspection as an alternative to 100% inspection. Among the pioneers of SPC we also distinguish W.E. Deming, J.M. Juran and P.B. Crosby. But it was during the Second World War that SQC gained generalized use and acceptance; it was used extensively in the USA and considered instrumental in the defeat of Japan. In 1946, the American Society for Quality Control was founded, which gave a huge push to the generalization and improvement of SQC methods.
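The control-chart logic Shewhart pioneered can be illustrated with a short numerical sketch. The sample data and variits below are illustrative assumptions, not figures from the text: the grand mean plus or minus three standard deviations of the sample mean gives the control limits, and a sample mean outside those limits signals an assignable cause rather than random variation.

```python
import math

# Illustrative measurement samples (made-up data), e.g. fill volumes.
samples = [
    [5.1, 4.9, 5.0, 5.2],
    [5.0, 5.1, 4.8, 5.0],
    [5.3, 5.2, 5.4, 5.3],   # deliberately shifted sample
    [4.9, 5.0, 5.1, 5.0],
]
n = len(samples[0])

means = [sum(s) / n for s in samples]
grand_mean = sum(means) / len(means)

def sample_var(s):
    """Unbiased variance of one sample."""
    m = sum(s) / len(s)
    return sum((x - m) ** 2 for x in s) / (len(s) - 1)

# Pool within-sample variances to estimate common-cause (random) variation.
sigma_within = math.sqrt(sum(sample_var(s) for s in samples) / len(samples))
sigma_mean = sigma_within / math.sqrt(n)

ucl = grand_mean + 3 * sigma_mean   # upper control limit
lcl = grand_mean - 3 * sigma_mean   # lower control limit

# Sample means beyond the limits point to an assignable cause.
out_of_control = [i for i, m in enumerate(means) if not lcl <= m <= ucl]
print(round(lcl, 3), round(ucl, 3), out_of_control)
```

Run as-is, the third sample’s mean falls above the upper limit, which is exactly the kind of non-random variation Shewhart’s charts are designed to flag.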
Good Luck.

INNOVATION MANAGEMENT

OVERVIEW OF TECHNOLOGY & INNOVATION MANAGEMENT
Technology and innovation play a crucial role in fostering the economic growth of nations and enhancing industrial competitiveness, through their dominant influence over industrial productivity as well as national security. As a result, technological innovation has always been intertwined with society’s progress, but never in history has technology been so visibly linked to improvements in the standard of living. For example, earlier civilizations were classified by the technologies they used, such as the Stone Age, Bronze Age and Iron Age, while more recent periods have been labeled the Steam Age, Electricity Age, Nuclear Age, Electronic Age, Space Age, Information Age and Biotechnology Age, all focusing on the most rapidly advancing technology of their time (Thamhain, 2005, p.25). It is not by accident that the richest people (Bill Gates, Jack Ma, Mark Zuckerberg, Jeff Bezos, Larry Page & Sergey Brin, and Jerry Yang & David Filo) and firms (Microsoft, Alibaba, Facebook, Amazon, Google and Yahoo) are those that significantly leverage technology and innovation. Hence, the importance of managing technology and innovation cannot be over-emphasized, as national and corporate success in today’s environment depends to a large extent on the ability to manage technology and innovation. Aunger (2010, p.762) captured this importance and called for a better understanding of technology:
Many historians suggest that technology is the driving force in history. This claim has become so prevalent that it’s recognized as a doctrine, called ‘technological determinism’. Technological superiority is what allows certain groups to conquer or subjugate others, and so expand their domain of influence… is also what separates us (humans) from every other creature on Earth. After all, the best chimpanzees can do on this front is to use small stones to break nuts open on large stones, whereas we (humans) build skyscrapers to the moon.
Therefore, a deep understanding of the capabilities of technology and innovation management is crucial, due to the critical role they play in the creation and execution of corporate (and national) strategy (Sahlman, 2010).
2.0 TECHNOLOGY
The term technology has been used by authors to mirror many things, such as types of products, manufacturing, information, capabilities, value chain processes, competitive advantage, and/or an outcome of research or innovation (Sahlman, 2010, p.38). As a result, rather than a single universally accepted definition, technology is conceptualized to take two major, distinct but overlapping meanings, i.e. technology (as a) resource and technology (as a) product (Ahmad & Ahmad, 2006). In general, technology consists of two primary components:
A Physical Component, which comprises items such as products, tooling, equipment, blueprints, techniques and processes; and
The Informational Component which consists of know-how in management, marketing, production, quality control, reliability, skilled labor and functional areas (Kumar, Kumar & Persaud, 1999).
2.1 Technology Change: There are two (2) distinct types of technology change, i.e. continuous/incremental on the one hand and discontinuous/breakthrough on the other. In general, four important variables (the speed of the change; the degree to which the change is noticeable; the impact of the change; and the identification of the inventor) form the basis for discriminating between these two types of technological change, as detailed in table 1.
Table 1: Continuous/Incremental vs. Discontinuous/Breakthrough Technology Change
Speed: continuous/incremental change is slow, gradual and plodding; discontinuous/breakthrough change is fast, speedy and swift.
Notice: continuous/incremental change is imperceptible, unremarkable and rarely noticed; discontinuous/breakthrough change is easily noticeable and remarkable.
Impact: continuous/incremental change improves an existing product and/or process; discontinuous/breakthrough change creates a new product and/or process.
Inventor: continuous/incremental change has no recognizable author/inventor; discontinuous/breakthrough change has a recognizable author/inventor.

Hence, continuous/incremental technology change refers to the gradual, often indiscernible technology flows that improve existing products or processes and are not attributed to a single inventor. Discontinuous/breakthrough technology change, in contrast, involves revolutionary technological advances due to the invention of new products or processes, and is quite often attributed to a particular inventor.
The transition from the four wooden-legged black-and-white TV to the (now old-fashioned) colour TV (see Figures 1 & 2) fits into the category of discontinuous change because it was easily noticeable (physically and functionally), it enabled viewing with full colour separation for the first time (impact), and it has a recognizable inventor (the Polish inventor Jan Szczepanik patented a colour television system in 1897). The shift from LCD to LED TV (as depicted in figure 3) is, however, a classic example of continuous technology change based on the aforementioned four variables.
Figure 1: Four wooden-legged black & white TV
Figure 2: Old-fashioned colour TV
It is important to note, however, that a clear distinction between the two types of technology change is largely limited to short-term observation. In the long run, the consensus of scholars is that the two are mutually inclusive; that is to say, technological change is characterized by long periods of incremental change punctuated by technological breakthroughs (Zyglidopoulos, 1999).
Figure 3: LCD vs. LED TV
Thus the overall shift in TV technology in the long run encompasses both discontinuous (black & white to colour) and continuous (LCD to LED) changes.
2.2 Degree of Technology Advancement
Khalil (2000) identified three (3) categories of industries in relation to their technology usage; however, only the two extreme categories are presented in table 2.
Table 2: High-Tech vs. Low-Tech Industries
Workforce: high-tech industries rely on highly skilled and educated workers; low-tech industries rely on unskilled and/or semi-skilled workers.
Operations: technology-based in high-tech industries; manual/semi-automatic in low-tech industries.
Technology change: breakthrough/discontinuous in high-tech industries; incremental/continuous in low-tech industries.
R&D intensity: high in high-tech industries; low and/or non-existent in low-tech industries.
Output: high-tech/complex products in high-tech industries; stable low-tech/non-tech products in low-tech industries.

2.3 Perspectives of Technology
There are two major perspectives from which to view technology in management, i.e. technology resource and/or technology product (Ahmad & Ahmad, 2006). The meaning of, sub-types of and similarities between these two types of technology are discussed in this section.
1. Technology Resource
Meaning: Technology resource is described as the codifiable and non-codifiable information and knowledge that is embedded partly in the manuals and standard practice, partly in the machinery and equipment, and partly in the people and social organization of a particular organization (Zyglidopoulos, 1999). From the definition, two important conclusions can be deduced. First, technology resource has three (3) elements: machinery, manpower and methods. Thus, technology resource encompasses three (3) of the six (6) overall management resources, excluding money, materials and market. Measuring the stock and effectiveness of a firm’s technology therefore entails, among other things, evaluating the quantity, quality, sophistication and variety of its machines as well as its manpower, which invariably determine the range and scope of methods achievable. Second, all organizations (profit, non-profit, governmental, small, large, new, old, etc.) require technology resources to survive.
Types of Technology Resources
There are basically two main types of technology resources, based on organizational requirements: effectual and ineffectual technologies, as described in figure 4.
a. Effectual Technologies add value to firms. They are further sub-divided into core/key and auxiliary technologies in line with their importance within an organization, department, unit, nation, etc. Core/Key Technologies are technologies of paramount importance to organizations, because the absence and/or failure of these technologies may result in immediate disruption of normal organizational operations. Hence, organizations rely on key technologies for their subsistence. Examples of core/key technologies: machine, an aircraft to an airline firm; manpower, a pilot to an airline; and method, account updates by banks. Auxiliary/Supporting Technologies, on the other hand, are technologies that organizations need in order to outperform competitors, and they generally improve back-office and front-line operations. Examples include machinery, satellite TV in patients’ rooms, and manpower, security personnel in a hospital. The absence of auxiliary technologies simply means a poor competitive position and does not necessarily lead to immediate disruption of corporate operations. In many industries, however, key technologies are taken for granted; take the airline industry, for example: competition is on the basis of auxiliary technologies such as on-board TV, games and radio stations. Based on the foregoing, it is recommended that firms always have back-ups readily available for core/key technologies in case of unforeseen circumstances.
b. Ineffectual Technologies, on the other hand, are technologies that do not add value to an organization; in fact, they deplete organizational value. As figure 4 shows, there are three main dimensions of ineffectualism, i.e. quantity, quality and variety. One form of ineffectualism is the acquisition of irrelevant technologies, or of technologies beyond the quantitative and/or qualitative requirements of a firm (see figure 4). Ineffectualism is a serious strategic issue for organizations, as it drains organizational resources and consumes space. Despite this, few organizations are completely immune from ineffectualism, owing to, among other things, corruption-induced overstocking of effectual technologies, poor maintenance culture, sudden changes in consumers’ taste, rapid invention of new technologies and the bandwagon effect.

Figure 4: Types of Technology Resource

2. Technology Product
Meaning: Technology products are unique corporate offers to the market that are used, not ‘consumed’. In essence, consumption of a technology product does not lead to its depletion, unlike non-technology products. Examples include computers, software, televisions, satellite signals, mobile phones and GSM service (Ahmad & Ahmad, 2006). Two (2) facts are also deducible from the definition of technology product. First, it has two major types: tangible (TV, laptop, car) and intangible (software, GSM service, satellite signal). Second, not all organizations produce technology products.
Similarities and Dissimilarities between Technology Resources & Technology Product
In terms of similarities, the machine is central to both technology resource and technology product, and both share the concept of technology in the literature. Similarly, technology resources are used to produce technology products. On the other side, while technology resources represent inputs (i.e. what firms require), technology products are outputs (what comes out of firms). Finally, while all organizations require technology resources to exist, only some offer technology products (as many organizations produce non-technology products).
 Classification of Technology Product 
Technology products are often classified based on six (6) overlapping perspectives: sophistication, market place effect, life cycle, contact nature, tangibility and motivation, as illustrated in figure 5.
a. Sophistication: Low technologies are simple and stable technologies that are easily produced, such as plates, buckets, chairs, etc. Low-technology products may be produced exclusively from a single material/substance or a few materials/substances. Low techs are also standalone machines and can be operated without prior formal training. High technologies are complex and unstable technologies that are rarely utilized in isolation and are made from a multitude of complex technologies. They are also rarely used without prior training or accumulated knowledge. Aircraft, computers and handsets are examples of high tech. Medium technologies, such as the bicycle and the wheelbarrow, share the features of both high and low techs.
b. Market Place Effect
Convergent technologies (also known as packet technologies) are technologies that perform the functions of two or more different technologies. Four-in-one equipment is a typical example, enabling printing, faxing, photocopying and scanning. Convergent technologies offer important benefits to consumers, such as smaller space requirements, less fatigue and overall cost savings. There are two main types of convergent technologies, simple and complex, depending on whether the multiple functions performed are related or not, respectively. Gartenberg (2002) argued that convergent technology may compromise functionality and result in a higher total cost of ownership; he also argued that their proliferation is mainly influenced by technology vendors’ technical capacity rather than market need. However, virtually all new and incremental technologies offer one or more elements of convergence. Disruptive technologies are simple, convenient-to-use innovations that are initially used only by the unsophisticated customers at the low end of the market. At the time of their invention, their impacts are seen as only incremental, if not inconsequential, and they are normally inferior and cheaper compared to the (incremental and/or sustaining) technologies they displace. Disruptive technologies are, however, very difficult to deal with: their low initial profit makes them less fashionable to established firms, their inferior nature gives them ample capacity for improvement, and their simplicity of usage and low cost make them easy to trial. Disruptive business models either create a new market or take the low end of an established market; the first type creates a new market by targeting non-consumers, while the second competes in the low end of an established market.
c. Life Cycle 
New and emerging technologies are technologies that enable users to perform new function(s), or different albeit better ways of performing an old function. New technologies can be further subdivided into first beneath the sky, first beneath the nation and first beneath the company. Not all new technologies mature, owing to high R&D investment requirements, lack of loyalty and ease of piracy. Incremental technologies (also referred to as sustaining technologies) usually improve a key parameter such as efficiency, quality, capacity, error rate or portability. Incremental technologies therefore present a new and better way of performing an old function; they are further divided according to the intensity of the improvement, i.e. minor or radical improvement. Incremental technologies obviously face fewer acceptance problems and lower R&D expenditure compared to new technologies.
Deliberate incremental effort usually targets safety, compatibility, resistance to extreme circumstances, and functional and physical upgrades, among others. Disappearing Technologies are technologies that are steadily fading from the market because of the invention of new/improved technologies and perhaps changes in consumers’ taste. These technologies become virtually useless because the new and improved technologies dominate them in all facets of customer value assessment. Some disappearing technologies are submerged quickly and unnoticeably, as they are completely dominated in virtually all areas of customer value assessment; others disappear slowly but steadily (partial dominance). Flash drives over floppy disks present a very good example of complete dominance, as the former completely dominated the latter in the areas of portability, capacity, durability, compatibility and cost-benefit. Whatever the nature of the disappearance, however, monitoring of disappearing technologies should form part of technology strategy in order to avoid last-minute expensive changes.
d. Contact Nature: Continuous contact technology products require an unending relationship between technology firms and their clients; most intangible technologies, such as telecom services, require a continuous supply of the product and hence an incessant relationship with clients. Discontinuous contact technology products only require an intermittent relationship with the technology firm, e.g. software (periodic updates) and motor vehicles (maintenance). Yet another category is one-off contact, e.g. buying a spoon, a sharpener, or a syringe and needle.
e. Tangibility: Technology products can be either tangible, such as a car, refrigerator or watch, or intangible, such as GSM service, a satellite signal or software. For tangible technologies such as laptops, mobile phones and aircraft, their physicality is a prerequisite for their functionality. Not so for intangible technologies, where functionality is not tied to physicality; for example, we routinely use satellite signals despite not being able to see, feel or touch them.
f. Motivation: Technology push occurs when new opportunities arising out of research give rise to new technologies, applications and refinements, which eventually find their way to the market place. Market pull technologies, on the other hand, are technology products whose development is driven by unfulfilled market needs.

Figure 5: Classification of Technology Product

3.0 INNOVATION
Inventions & Discoveries (I&D) are the starting point of the innovation process (Burgelman, Christensen & Wheelwright, 2004, p.2). The bottom line for an innovation is the market, which will buy it or ignore it, thereby determining success or failure. The term innovation comes from the Latin word innovare, which means ‘to make something new’ (Amidon, 2003; Tidd, Bessant, & Pavitt, 2005). Innovation in its multiple dimensions (product, process, marketing, original, technological, etc.) is a key success requirement in today’s business environment (Hamel, 2001).
3.1 Origin & Meaning
We discover what existed though unknown to us, while we invent what never existed before. Hence we discover islands/natural resources and invent machines/devices. Invention and discovery (I&D) are the result of creative processes that are often serendipitous and very difficult to predict or plan. The success criteria for I&D are technical (is it true/real?) rather than commercial. However, through patents, I&D sometimes allow their originators to establish a potential for economic gain through subsequent innovations. Rogers (2003) suggests that innovations are the commercialization of inventions (idea-to-cash); in simple terms: Innovation = invention + commercialization.
Not all I&D graduate to innovations; I&D that have economic potential, solve societal problems or increase the wellbeing of society are the targets of innovators. Innovation therefore refers to the successful commercialization of I&D. Unlike for I&D, the criteria for success in innovation are commercial rather than technical: a successful innovation is one that returns the original investment in its development plus some additional returns. The innovation process involves the integration of existing technology and inventions to create a new product, process or system. Hence, innovation represents the important connection between an idea and its exploitation or commercialization. In a world that is changing so fast, what companies need is not (necessarily) the best practice but a new practice; as such, the greatest rewards go to companies that create new business models (Hamel, 2001). Wealth, according to Hamel (2001), is created with the mind, not (only) resources.
3.2 Phases of Innovation
Economists such as Schumpeter (1939), Barthwal (2007) and Mariano (2004) break down the process of technology innovation into a sequence consisting of three phases, which are discussed below:
a. Invention: Invention is the creation of a novel technology (idea, machine or process), such as the steam engine, the transistor and the Xerox machine. Inventions occur as a result of human ingenuity and imagination. They occur only sporadically, sometimes happening by chance or through trial and error, other times via a formal scientific endeavor. There is usually a lag between scientific discoveries and inventions. It may take years to convert science into technology, and more years to move an invention to the market as a product or a service. Even though many inventions are generated by creative people, and many of them are patented, only a few reach the market place and fewer still have a lasting impact.
b. Innovation: Innovation represents the important connection between an idea and its exploitation or commercialization. The bottom line for an innovation is the market, which will buy it or ignore it, thereby determining success or failure.
c. Diffusion: Diffusion is defined as the process by which an innovation is adopted and gains acceptance by members of a certain community. A number of factors interact to influence the diffusion of an innovation. The four major factors are the innovation itself; how information about the innovation is communicated; time; and the nature of the social system into which the innovation is being introduced (Rogers, 1995).
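Diffusion over time is classically pictured as an S-curve: adoption starts slowly among early adopters, accelerates, then saturates. The minimal logistic sketch below illustrates that shape; the market size, growth rate and midpoint are illustrative assumptions, not figures from Rogers.

```python
import math

def adopters(t, market_size=1000, growth=0.8, midpoint=10):
    """Cumulative adopters at time t on a logistic (S-shaped) diffusion curve."""
    return market_size / (1 + math.exp(-growth * (t - midpoint)))

# Adoption is slow at first, fastest near the midpoint, then saturates.
for t in (0, 5, 10, 15, 20):
    print(t, round(adopters(t)))
```

At the midpoint half the market has adopted; the curve then flattens as cumulative adoption approaches the total market size.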
3.3 Types of innovations
An enterprise can make many types of changes in its methods of work, its use of factors of production and the types of output it produces that improve its productivity or commercial performance. Studies have identified four types of innovations: product innovations, process innovations, marketing innovations and organizational innovations (OECD, 2005).
a. Product Innovation: Is the introduction of a good or service that is new or significantly improved with respect to its characteristics or intended uses. It includes significant improvements in technical specifications, components and materials, incorporated software, user friendliness or other functional characteristics. New products are goods and services that differ significantly in their characteristics or intended uses from products previously produced by the firm. Product innovations related to goods include products with significantly reduced energy consumption, significant changes in products to meet environmental standards, and so on.
b. Process Innovation: Is the implementation of a new or significantly improved production or delivery method for the creation and provision of services. It includes significant changes in the equipment and/or in the procedures or techniques employed to deliver services. It is intended to decrease unit costs of production or delivery, or to increase quality. Production methods involve the techniques, equipment and software used to produce goods or services, including the installation of new or improved manufacturing technology such as automation equipment, computerized equipment for quality control of production and improved testing equipment for monitoring production.
c. Marketing Innovation: Is the implementation of a new marketing method involving significant changes in product design, packaging, placement, promotion or pricing. It is aimed at better addressing customer needs, opening up new markets, or newly positioning a firm’s product on the market, and ultimately at increasing the firm’s sales.
d. Organizational Innovation: Is the implementation of a new organizational method in the firm’s business practices, workplace organization or external relations. It is intended to increase a firm’s performance by reducing administrative or transaction costs, improving workplace satisfaction, or reducing the cost of supplies. In business practices, it involves the implementation of new methods for organizing routines and procedures for the conduct of work, and the implementation of new practices to improve learning and knowledge sharing within the firm and to make knowledge more easily accessible to others.

4.0 INNOVATION MANAGEMENT
In a world that is changing so fast, what companies need is not (necessarily) the best practice but a new practice; as such, the greatest rewards go to companies that create new business models (Hamel, 2001). Wealth, according to Hamel (2001), is now created with the mind, not resources.
4.1 Drivers of Innovation
Drivers of innovation activities, especially from an organizational perspective, can conveniently be classified into two broad areas: indirect, otherwise known as external, drivers (opportunities) and direct, otherwise known as internal, drivers (the ability to exploit those opportunities). As the names suggest, direct drivers are largely under the direct control and manipulation of an organization; in other words, firms have the power to make decisions on these factors (Yin, 1985). Indirect drivers, on the other hand, are not under direct control; organizations may try to influence them via proactive strategies such as technology foresight, but by their nature they do not lend themselves to direct control. While there is an avalanche of literature on these drivers (Yin, 1985, p.19), internal drivers include personnel quality/structure, qualitative leadership, effective idea management, organizational form/cooperation and scale of unit (size), while external drivers include evolving technologies, changing market characteristics/incentives, environmental conditions, collaboration and the role of government. However, only two external (evolving technologies and changing market characteristics/incentives) and two internal (qualitative leadership and effective idea management) drivers are discussed below:
a. Evolving Technologies: Successful innovations rest on new, improved, incremental or disruptive technologies. For example, Ted Turner wed two technological developments, the shoulder-held minicam and more affordable access to satellite transmission, to innovate the concept of a continuous news format. Tele-surgery became a reality only after the existence of fibre-optic cable, high-speed internet links and robotic arm technology. Similarly, e-business, e-banking, virtual banking and m-commerce innovations are only possible with developments and improvements in a number of technologies.
b. Changing Market Characteristics: In addition to evolving technologies, successful innovators capitalize on changing market characteristics such as changes in customers' tastes, lifestyles, climatic conditions, per capita income, sophistication, national infrastructure etc. For example, in addition to the minicam and satellite technology, Turner also capitalized on people's busy lifestyles, which make it difficult for them to always be at home for the news hour (e.g. the six o'clock news in the US or the nine o'clock news in Nigeria), to innovate the concept of a continuous news format.
c. Qualitative Leadership: Every single employee has the same potential as the most senior person in an organization to influence the destiny of the organization to which they are devoting their lives. Hamel identified three kinds of authority within organizations: positional, intellectual, and moral. Senior executives are pretty good at wielding positional power but are not always good at wielding intellectual power and moral authority, and we live in a world where it is not clear that people at the top of the organization are smarter than those below them (Hamel, 2003). Mintzberg (1996), on the other hand, observed that there are two types of innovators in organizations: rare talented employees who can 'see the world we can't', and those who can get extraordinary performance from average employees. The task of leaders is to allow all categories of employees to exhibit their skills; in essence, autocratic leadership styles should be eschewed. Hence, to institutionalize innovation, organizations should pursue equity, team effort, circle/flat organization, distributed authority, avoidance of excess workload, an additional percentage of the budget for radical ideas and bonuses, simple design, reduced corporate monopoly on funds allocation, and tolerance and reward of mistakes, punishing only inaction.
d. Effective Idea Management: An effective idea management process begins with idea collection/generation through customers, employees, consultants, specialists, competitors, observation, benchmarking, suppliers and market intelligence. The next step is keeping an ideas database (corporate memories) in which ideas are kept and shared not only as text but also as pictures, diagrams, animations etc., using various types of database (such as distributed-processing/distributed databases, object-oriented databases, online databases, and multidimensional databases). This enables cross-fertilization of ideas: in an 'idea box', all varieties of ideas are kept and shared via the organizational intranet, though the memories in such tech boxes would eventually die if designers did not constantly look at the material, play with it, and use it in their work. Companies lose what they learn when people leave; hence, qualitative staff should be motivated to remain. Geographic distance, political squabbles, internal competition, and bad incentive systems may hinder the spread of ideas. Selecting and implementing ideas is the final stage of idea management; it entails, among other things, evaluating the viability, feasibility, and practicability of the idea.
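The shared idea database described above can be sketched as a minimal in-memory store. This is a hypothetical illustration only: the `Idea` and `IdeaBox` names are invented here, and a real corporate memory would sit behind an intranet and a proper database.

```python
from dataclasses import dataclass, field

@dataclass
class Idea:
    """One entry in a shared 'corporate memory' of ideas."""
    title: str
    source: str                 # e.g. customer, employee, competitor analysis
    tags: list = field(default_factory=list)

class IdeaBox:
    """Minimal in-memory idea database: collect ideas, then let anyone
    retrieve them by tag (the cross-fertilization step)."""
    def __init__(self):
        self._ideas = []

    def submit(self, idea: Idea) -> None:
        self._ideas.append(idea)

    def search(self, tag: str) -> list:
        return [i for i in self._ideas if tag in i.tags]
```

A real implementation would add persistence and access from across the organization; the point here is only the collect-store-share cycle the text describes.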
In conclusion, organizations leverage innovation only when they create a conducive internal environment, via qualitative leadership and an effective idea management process, to take advantage of evolving technologies and changing market characteristics.
4.2 Barriers to Innovation
Given the great contribution of innovative activities to enterprises' competitiveness and success, it is of great interest to identify the barriers and obstacles that limit the development of innovative activities in firms. Sileshi (2012) identified the following barriers to innovation:
a. Lack of Finance: Lack of finance is one of the most important constraints to innovation (Silva et al., 2007; Lim & Shyamala, 2007). For instance, lack of financial resources appears to be perceived as more important by small firms than by large firms, while organizational factors are more relevant barriers for large enterprises than for small ones. According to Hall (2002), financial problems are particularly critical in the case of innovation activities for several reasons. First, innovation projects are riskier than physical investment projects, so outside investors require a risk premium to finance innovation activities. Second, innovators are reluctant to share information about their innovation with outside investors, who consequently may not invest. Third, some innovation projects may never be started, or may be delayed or abandoned, because of the risk of bankruptcy and the low value of intangibles in case of liquidation (Gomes, Yaron & Zhang, 2006).
b. High Cost of Innovation: The cost of innovation is another major barrier. Tourigny and Le (2004) found that the high cost of innovation is likely to be perceived as an important hampering factor by firms.
c. Lack of Skilled Personnel: Many firms lack the managerial and technical skills needed for new technology adoption and retention, which inhibits their effectiveness and competitiveness. The lack of technically qualified personnel has been found to negatively affect the ability of firms to innovate.
d. Inadequate R&D: Empirical evidence shows that R&D enhances innovation. Without the adequate knowledge, information and systematic analysis that R&D provides, it is much harder for companies to assess the potential and threats of global business.
e. Lack of Collaboration: It is becoming more and more difficult to maintain a competitive advantage through internal R&D alone, because of the fast-changing environment and the spread of knowledge via the internet. Nowadays, large multinational companies look to generate knowledge externally through acquisitions, venture capital investments and collaborations (Kang & Kang, 2009).
Robson, Paul, Halen and Bernard (2009) found that contacts with other firms, either locally or in export markets, appeared to stimulate innovation activity. Firms engage in collaboration in order to complement their internal resources, and accordingly team up with partners who control the relevant complementary resources (Miotti & Sachwald, 2003).
4.3 Benefiting From Innovation
Technology and innovation leadership in firms does not necessarily result in economic benefits. Teece (1998) argues that the capacity of a firm to appropriate the benefits of investment in technology depends on two factors: (i) the firm's capacity to translate its technological advantage into commercially viable products or processes (innovation); and (ii) the firm's capacity to defend its advantage against imitators. Some of the factors that influence a firm's capacity to benefit commercially from its technological innovation are as follows:
a. Secrecy is considered by industrial managers to be an effective form of protection, especially for process innovations. However, it is unlikely to provide absolute protection, because some process characteristics can be identified from an analysis of the final product, and because process engineers form a professional community through which knowledge circulates.
b. The Learning Curve in production generates both lower costs and a particular and powerful form of accumulated, largely tacit knowledge that is well recognized by practitioners. In certain industries and technologies (e.g. semiconductors, continuous processes), first-comer advantages are potentially large, given the major possibilities for reducing unit costs as cumulative production increases. However, such 'experience curves' are not automatic; they require continuous investment in training and learning.
c. Complementary Assets: the effective commercialization of an innovation very often depends on assets (or competencies) in production, marketing and after-sales service that complement those in technology.
d. Product Complexity: previously, for example, IBM could rely on the size and complexity of its mainframe computers as an effective barrier against imitation, given the long lead times required to design and build copy products. With the advent of the microprocessor and standard software, these technological barriers to imitation disappeared, and in the late 1980s IBM faced strong competition from IBM 'clones' made in the USA and East Asia. Boeing and Airbus have faced no such threat to their positions in large civilian aircraft, since the costs and lead times for imitation remain very high. Managers recognize that product complexity is an effective barrier to imitation.
e. Strength of Patent Protection can be a strong determinant of the relative commercial benefits to innovators and imitators. Patents are judged to be more effective in protecting product innovations than process innovations in all sectors except petroleum refining, probably reflecting the importance of improvements in chemical catalysts for increasing process efficiency. Patent protection is also rated more highly in chemical-related sectors (especially drugs) than in other sectors, because it is generally more difficult to 'invent around' a clearly specified chemical formula than around other forms of invention.
Finally, we should note that firms can use more than one of the above factors to defend their innovative lead. For example, in the pharmaceutical industry, secrecy is paramount during the early phases of research, but in the later stages, patents (in which much basic information is disclosed) become critical. Complementary assets, such as global sales and distribution, become more important at the commercialization stage.

5.0 TECHNOLOGY & INNOVATION MANAGEMENT (T&IM)
Technology Management (TM), alternatively referred to as Management of Technology (MoT), was set in motion when man invented the wheel; now, however, it has become an organized and systematic discipline. Thamhain (2005, p.6) views MoT as the art and science of creating value by using technology together with the other resources of an organization. According to the National Research Council (1987), MoT links engineering, science and management disciplines to plan, develop and implement technological capabilities to shape and accomplish the strategic and operational goals of an organization. Table 3 describes the complementary roles of science/engineering vis-à-vis management disciplines in T&IM; it is evident that no single discipline will achieve sustainable success in T&IM without the other.
Table 3: Roles of Science/Engineering vs Management Disciplines in Technology & Innovation Management

|  | Science & Engineering Disciplines | Management Disciplines |
| --- | --- | --- |
| Areas | Biology, Chemistry, Computing, Electrical, Geology, Mechanical, Physics, etc. | Accounting, Economics, Entrepreneurship, Finance, Marketing, Sociology, etc. |
| Tasks | Inventions & Discoveries | Innovation & Diffusion |
| Method | Experiments & Simulation | Survey & Focus Group Analysis |
| Tech Resource | Machines and, to a lesser extent, manpower | Manpower and Method |
| Tech Product | Development/Improvement | Feasibility Studies & Organizational Alignment |
| Innovation | Product & Process | Marketing & Organizational |
| Outcome | Technical (Creation) | Commercial |

As TM embraces several interconnected issues, ranging from policy planning at the national level to strategic planning at the firm level, it calls for decisions and result-oriented actions at the macro and micro levels, and for an effective macro-micro linkage, as described in Table 4.

Table 4: Scope of Technology & Innovation Management

|  | Macro Level | Micro Level |
| --- | --- | --- |
| Focus | Economic Growth & Development | Profit |
| Scope | National/Industrial Technological Capabilities | Firm's Technological Capabilities |
| Industry | Industry Regulators | Industry Players |
| Competitors | Other Countries | Other Firms in the Industry |
| Task | Creating an Enabling Environment | Competing in the Operating Environment |
| Institutions | Gov't Ministries, Directorates & Agencies | Individual Firms |

Both macro and micro technology management seek to raise economic efficiency. Micro TM is the basis for macro TM, while the latter provides guidelines and an environment for the former. Consistency between these two levels of management is essential, but institutional mechanisms will largely determine whether they are effectively combined. While macro-level support can catalyze change, the real action must take place at the industry level.
5.1 Importance of Technology & Innovation Management
A number of factors mean that T&IM is of crucial importance to individuals, communities, organizations and countries. These factors include mankind's ability to understand, dominate and control the environment, national competitiveness, etc. Several of them are discussed below:
a. Mankind's Ability to Dominate the Environment: Man conquers nature, changes nature, and attains freedom from nature through invention and innovation. With technological invention and innovation, mankind has pulled itself from the mud huts of nut and berry gatherers through the Stone, Bronze, and Iron Ages, the Industrial Revolution, and into what has been called the Atomic Age, the Electronic Age, the Computer Age, the Second and Third Industrial Revolutions, the Internet Age, and the emerging Internet of Things (IoT) Age. Both invention and innovation are thus important weapons for attaining freedom from nature, and important symbols of mankind's civilization and progress.
b. National Competitiveness: T&IM is also critical for national competitiveness. Where countries are not in a position to engage effectively in innovation activities, they are inevitably dependent on other countries' innovated products, imported with hard currency from developed and other developing countries. This typically holds true for countries like Ethiopia and Nigeria. It is not by coincidence that countries such as the US, Japan, South Korea and Canada, which leverage T&I, are among the most developed, while countries that score low in exploiting T&I are underdeveloped. In the latter situation, nations become victims, not beneficiaries, of T&I. As such, full exploitation of T&I offers boundless potential for improving the economies of developing nations. Recently, Japan adopted a policy of using robotic technology to circumvent the projected manpower shortfall for the optimal performance of its economy.
c. Industrial Impact: One hundred years ago, in 1911, Schumpeter (1961) argued that technological change is the major factor shaping the growth, decline and structure of industries across the world. Specifically, he professed that technology is the ultimate force behind the emergence, evolution, fusion, and disruption of industries over time; Table 5 provides examples to support this assertion.
Table 5: Technology & Industry Change
| S/N | Type of Influence | Technologies | Industry/Industries | Examples |
| --- | --- | --- | --- | --- |
| 1 | Emergence | Handsets and Wireless | Telecommunications | Mobile Phone |
| 2 | Evolution | Internet & Extranet | Banking and Education | E-Banking & E-Learning |
| 3 | Fusion | Intranet and Extranet | Banking & Insurance | Universal Banking |
| 4 | Disruption | Satellite Dish and TV | Entertainment | Cinema |

d. Technology's Omnipresence: T&I have a pervasive influence on individuals, communities, firms, industries, nations and even supranational institutions. As highlighted earlier, the richest people and firms made their fortunes via T&I; the same applies to nations and communities. At the individual and family levels, quality of life and progress are functions of the ability to capitalize on T&I capabilities.
e. Ambivalent Impacts: In spite of all the aforementioned, the impact of T&I is not always positive. In fact, a significant number of business failures are attributed to the inability of inventors/innovators to translate technological creativity into profitable operations; as a result, T&I has led to spectacular corporate losses, not merely wealth creation.
f. Strategic Importance: Technology has become a key strategic element (Porter, 1985) and hence the need for new management approaches to synchronise technology with business strategy (Mitchel, 1988). The MIT commission on industrial productivity concluded as early as 1990 that:
For too long business schools have taken the position that a good manager could manage anything, regardless of its technological base. It is now clear that view is wrong. While it is not necessary for every manager to have a science or engineering degree, every manager does need to understand how technology relates to the strategic positioning of the firm, how to evaluate alternative technologies and investment choices, and how to shepherd scientific and technical concepts through the innovation and production processes to the marketplace (Dertouzos, Lester, & Solow, 1990).

6.0 TECHNOLOGY & INNOVATION THEORIES
6.1 Technology Theories
There are basically two extreme schools of thought about technology: the determinist and the instrumentalist. Surry (1997) summarizes the major distinctions between these schools of thought.
a. Technological Instrumentalists view technology as a mere tool. Instrumentalists often cite the knife as an example: a tool that can be used for either good or evil, depending upon the intentions of the person employing it. Hence they see technology as a tool, largely under human control, that can be used for either positive or negative purposes. Instrumentalists suggest that social change and human needs are the keys to success in the marketplace; in essence, no matter how good a technology is, unless it satisfies a human need it is not likely to succeed.
b. Technological Determinists view technology as an autonomous force, beyond direct human control, and see technology as the prime cause of social change and global happenings. Determinists see technology as the most powerful force in changing individuals, societies, communities, nations and the entire universe; as such, they view a very powerful and well-designed technology as bound to have a significant impact in the marketplace, a sort of product-concept ideology.
c. Socio-Technical Theory (STT) argues that organizational success relies on a firm's ability to achieve a good blend between its social and technical sub-systems (French & Ball, 1999). In its simplest form, the theory holds that organizations consist of two interdependent sub-systems, a social system and a technical system, and that changes in one significantly affect the other (French & Ball, 1999). The social sub-system comprises organizational employees as well as their knowledge, needs, interactions and commitments, while the technical sub-system consists of the tools, techniques, procedures and knowledge used by the organization (Kontoghiorghes, 2005).
6.2 Innovation Theories
Four of the theories discussed by Rogers are among the most widely used theories of innovation diffusion, as follows:
a. The Innovation Decision Process Theory states that diffusion is a process that occurs over time and can be seen as having five distinct stages: Knowledge, Persuasion, Decision, Implementation, and Confirmation (Rogers, 1995). According to this theory, potential adopters of an innovation must learn about the innovation, be persuaded of its merits, decide to adopt it, implement it, and confirm (reaffirm or reject) the adoption decision.
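As a purely illustrative sketch (Rogers gives no code, and the names here simply mirror his stages), the ordered five-stage process can be modeled as an enumeration:

```python
from enum import Enum

class Stage(Enum):
    """Rogers' five innovation-decision stages, in order."""
    KNOWLEDGE = 1
    PERSUASION = 2
    DECISION = 3
    IMPLEMENTATION = 4
    CONFIRMATION = 5

def next_stage(stage: Stage) -> Stage:
    """Advance a potential adopter one stage; Confirmation is terminal."""
    return Stage(stage.value + 1) if stage is not Stage.CONFIRMATION else stage
```

The point of the model is simply that diffusion is sequential: an adopter cannot reach Implementation without first passing through Knowledge, Persuasion and Decision.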
b. The Individual Innovativeness Theory states that individuals who are predisposed to being innovative will adopt an innovation earlier than those who are less predisposed (Rogers, 1995). Figure 6 shows the bell-shaped distribution of individual innovativeness and the percentage of potential adopters theorized to fall into each category. On one extreme of the distribution are the Innovators, the risk takers and pioneers who adopt an innovation very early in the diffusion process. On the other extreme are the Laggards, who resist adopting an innovation until rather late in the diffusion process, if ever.

Figure 6: Bell-shaped distribution of individual innovativeness, with adopter categories Innovators (2.5%), Early Adopters (13.5%), Early Majority (34%), Late Majority (34%), and Laggards (16%).
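Because the category shares in Figure 6 are cumulative percentiles of a normal distribution (2.5%, 16%, 50%, 84%, 100%), an adopter can be classified from their adoption time. The sketch below is illustrative, assuming normally distributed adoption times; the function name and parameters are invented here.

```python
from statistics import NormalDist

# Rogers' adopter categories as cumulative shares of the population.
CUTOFFS = [
    (0.025, "Innovator"),
    (0.16,  "Early Adopter"),   # 2.5% + 13.5%
    (0.50,  "Early Majority"),  # + 34%
    (0.84,  "Late Majority"),   # + 34%
    (1.00,  "Laggard"),         # + 16%
]

def adopter_category(adoption_time: float, mean_time: float, std_dev: float) -> str:
    """Classify an adopter by where their adoption time falls in the
    (assumed normal) distribution of adoption times."""
    percentile = NormalDist(mean_time, std_dev).cdf(adoption_time)
    for cutoff, label in CUTOFFS:
        if percentile <= cutoff:
            return label
    return "Laggard"
```

For example, with a mean adoption time of 24 months and a standard deviation of 8 months, someone adopting at month 0 falls in the bottom 2.5% and is an Innovator, while someone adopting at month 48 is a Laggard.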
c. The Theory of Rate of Adoption states that innovations are diffused over time in a pattern that resembles an S-shaped curve (Rogers, 1995). It theorizes that an innovation goes through a period of slow, gradual growth before experiencing a period of relatively dramatic and rapid growth; an example of how the rate of adoption might typically be represented by an S-curve is shown in Figure 7. The theory also states that, following the period of rapid growth, the innovation's rate of adoption will gradually stabilize and eventually decline.

Figure 7: S-Curve Rate of Adoption (number or percentage of adopters plotted against time, with a period of rapid growth in the middle of the curve).
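Rogers describes the S-curve qualitatively; a logistic function is one common way to model it. The sketch below is illustrative only, with invented parameter names (`k` for steepness, `t_mid` for the midpoint of rapid growth, `saturation` for the eventual ceiling):

```python
import math

def logistic_adoption(t: float, k: float = 1.0, t_mid: float = 0.0,
                      saturation: float = 1.0) -> float:
    """Cumulative adoption at time t under a logistic (S-curve) model:
    slow early growth, rapid growth near t_mid, then levelling off."""
    return saturation / (1.0 + math.exp(-k * (t - t_mid)))
```

At `t = t_mid` half of the eventual adopters have adopted; well before it, adoption is near zero, and well after it, adoption approaches the saturation level, matching the three phases of the curve in Figure 7.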
d. The Theory of Perceived Attributes states that potential adopters judge an innovation based on their perceptions of five attributes of the innovation (Rogers, 1995): trialability, observability, relative advantage, complexity, and compatibility. The theory holds that an innovation will experience an increased rate of diffusion if potential adopters perceive that the innovation:
(1) can be tried on a limited basis before adoption;
(2) offers observable results;
(3) has an advantage relative to other innovations (or the status quo);
(4) is not overly complex; and
(5) is compatible with existing practices and values.
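Rogers gives no scoring formula, but the five attributes can be illustrated with a hypothetical equal-weight score; the 0-10 scale, the weights, and the function name below are assumptions of this sketch, not part of the theory.

```python
# The five perceived attributes from Rogers (1995).
ATTRIBUTES = ("trialability", "observability", "relative_advantage",
              "complexity", "compatibility")

def diffusion_outlook(ratings: dict) -> float:
    """Average the five perceived attributes (each rated 0-10).
    Complexity hinders adoption, so it is counted inversely."""
    score = sum(ratings[a] for a in ATTRIBUTES if a != "complexity")
    score += 10 - ratings["complexity"]   # less complex -> better outlook
    return score / len(ATTRIBUTES)
```

An innovation rated high on the four favourable attributes and low on complexity scores near 10, i.e. the theory predicts a fast rate of diffusion for it.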
