302 – Decision Science University Syllabus 2023
Chapter 1: Introduction to Decision Science
Decision Sciences is a field that combines various quantitative and qualitative methods to analyze complex problems and make informed decisions. It involves a multidisciplinary approach that draws on fields such as mathematics, statistics, economics, psychology, and computer science to develop analytical methods and tools to improve decision-making.
The importance of Decision Sciences lies in its ability to help individuals and organizations make data-driven decisions. In today's complex and rapidly changing business environment, decision-making is often complicated by a vast amount of data and multiple factors to consider. Decision Sciences offers a structured approach to decision-making by providing models and tools to analyze data and evaluate options.
Quantitative techniques play a critical role in decision-making as they provide a structured and objective approach to analyze data and evaluate options. They help decision-makers to make informed decisions by providing a systematic and scientific method to understand the relationships between different variables and make predictions about the likely outcomes of different decisions. Here are some specific ways in which quantitative techniques can aid decision-making:
Data Analysis: Quantitative techniques enable decision-makers to analyze data and extract insights from it. By using statistical methods, data can be analyzed to identify trends, patterns, and correlations that can provide valuable insights to support decision-making.
Forecasting: Quantitative techniques can be used to forecast future outcomes based on historical data. This helps decision-makers to understand the likely consequences of different decisions and make informed choices.
Optimization: Quantitative techniques can be used to optimize decisions by finding the best solution to a problem based on a set of constraints. This can help decision-makers to identify the most efficient way to allocate resources, minimize costs, or maximize profits.
Simulation: Quantitative techniques can be used to simulate different scenarios to predict their likely outcomes. This can help decision-makers to understand the potential consequences of different decisions and make informed choices.
An assignment model is a mathematical optimization model that is used to determine the best assignment of resources or tasks to a set of agents or machines. In this model, the objective is to find the optimal assignment that minimizes or maximizes a specific objective function, subject to a set of constraints.
The model is used in various applications, such as workforce scheduling, transportation planning, and production scheduling. The assignment model is commonly used in situations where a set of tasks or jobs must be performed by a group of agents or machines, and there are costs or benefits associated with each possible assignment.
Flood's technique, also known as the Hungarian method, is an algorithm used to solve assignment problems. It was developed by Harold Kuhn in 1955 and is based on the work of two Hungarian mathematicians, Dénes Kőnig and Jenő Egerváry. The Hungarian method is an efficient algorithm for solving assignment problems and can handle large problems with many agents and tasks. It is widely used in operations research, logistics, and other fields where resource allocation problems need to be solved.
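To make the assignment model concrete, here is a minimal sketch that finds the minimum-cost assignment by brute force, enumerating every permutation of tasks over agents. Note that this is not the Hungarian method itself (which avoids exhaustive enumeration); the cost matrix is made up for illustration.

```python
from itertools import permutations

def solve_assignment(cost):
    """Brute-force minimum-cost assignment: try every permutation
    of tasks over agents and keep the cheapest one."""
    n = len(cost)
    best_perm, best_cost = None, float("inf")
    for perm in permutations(range(n)):
        total = sum(cost[agent][task] for agent, task in enumerate(perm))
        if total < best_cost:
            best_perm, best_cost = perm, total
    return best_perm, best_cost

# Hypothetical cost matrix: cost[i][j] = cost of agent i doing task j
cost = [[4, 2, 8],
        [4, 3, 7],
        [3, 1, 6]]
assignment, total = solve_assignment(cost)
```

Brute force is O(n!), so it is only workable for tiny instances; for real problems the Hungarian method (available, for example, as `scipy.optimize.linear_sum_assignment` in SciPy) solves the same problem in polynomial time.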
Types of Assignment Problem
1. Balanced Assignment Problem: The balanced assignment problem is one in which the number of agents or machines equals the number of tasks to be assigned. The objective is to assign each task to a single agent or machine in a way that minimizes the total cost or maximizes the total benefit.
2. Unbalanced Assignment Problem: The unbalanced assignment problem is one in which the number of agents or machines does not equal the number of tasks: there may be more tasks than agents, or more agents than tasks. Such a problem is usually converted to a balanced one by adding dummy rows or columns with zero cost before it is solved.
3. Minimization Assignment Problem: In an assignment problem, the objective is typically either to minimize cost or to maximize benefit. The minimization assignment problem is one in which the objective is to minimize the total cost of assigning tasks to agents or machines.
4. Maximization Assignment Problem: The maximization assignment problem is one in which the objective is to maximize the total benefit or profit of assigning tasks to agents or machines. Each task has a certain benefit or profit associated with it, and the objective is to find the assignment that maximizes the total benefit while satisfying the constraints.

Transportation models are mathematical models used to optimize the transportation of goods or materials from multiple origins to multiple destinations while minimizing transportation costs or maximizing profits. They can be used to solve problems such as determining the optimal distribution of products from manufacturing plants to retail stores, the routing of supplies to construction sites, or the transportation of raw materials to production facilities.
Basic initial solution transportation Model: The initial basic feasible solution is a crucial step in solving transportation problems using the transportation model. There are different methods to obtain an initial basic feasible solution, including the Northwest Corner Method, the Least Cost Method, and Vogel's Approximation Method.
1. The Northwest Corner Method is a simple method that starts by allocating as much as possible to the first row and first column, then moving to the next row and column, and repeating the process until all supply and demand constraints are satisfied. This method is easy to understand and apply, but it may not always result in an optimal solution.
2. The Least Cost Method starts by allocating as much as possible to the cell with the lowest unit transportation cost until the capacity of either its origin or its destination is exhausted. The process is repeated on the remaining cells until all supply and demand constraints are satisfied. This method usually produces a better initial solution than the Northwest Corner Method but is more time-consuming.
3. Vogel's Approximation Method considers the differences in the unit transportation costs between the origins and destinations. The method calculates a penalty for each row and column (the difference between its two lowest costs), and then allocates to the lowest-cost cell of the row or column with the highest penalty. This process is repeated until all supply and demand constraints are satisfied.
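The Northwest Corner Method described above can be sketched in a few lines. The supply, demand, and cost figures below are made up for illustration:

```python
def northwest_corner(supply, demand):
    """Northwest Corner Method: fill allocations starting from the
    top-left cell, exhausting supply rows and demand columns in turn."""
    supply, demand = supply[:], demand[:]   # work on copies
    i = j = 0
    allocations = []                        # (row, col, quantity)
    while i < len(supply) and j < len(demand):
        qty = min(supply[i], demand[j])
        allocations.append((i, j, qty))
        supply[i] -= qty
        demand[j] -= qty
        if supply[i] == 0:
            i += 1                          # row exhausted: move down
        else:
            j += 1                          # column exhausted: move right
    return allocations

# Hypothetical balanced problem: 3 origins, 3 destinations, 75 units total
supply = [20, 30, 25]
demand = [10, 25, 40]
costs = [[2, 3, 1],
         [5, 4, 8],
         [5, 6, 8]]
alloc = northwest_corner(supply, demand)
total_cost = sum(qty * costs[i][j] for i, j, qty in alloc)
```

Since the method ignores the costs entirely, the resulting total cost is only a starting point to be improved by an optimality check such as the stepping stone method.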
- The optimal solution for a transportation model is the solution that minimizes the total transportation cost while satisfying all the supply and demand constraints.
- The stepping stone method evaluates each unused (non-basic) cell by tracing a closed path through occupied cells and computing the change in total transportation cost that reallocating one unit along that path would cause; if any cell yields a negative change, the current solution can be improved.
Chapter 2: Linear Programming
Linear programming is a mathematical technique used to find the optimal solution for a linear objective function subject to a set of linear constraints. It is widely used in various fields, such as operations research, economics, engineering, and management.
Concept: Linear programming involves maximizing or minimizing a linear objective function subject to a set of linear constraints. The linear objective function represents the quantity that needs to be optimized, while the linear constraints represent the limitations or restrictions on the decision variables.
Examples:
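For a two-variable linear program, a bounded optimum always lies at a vertex of the feasible region, so a small problem can be solved by enumerating the intersection points of the constraint boundaries. The problem below (maximize 3x + 2y) is made up for illustration:

```python
from itertools import combinations

# Maximize 3x + 2y subject to x + y <= 4, x <= 2, x >= 0, y >= 0.
objective = (3, 2)
constraints = [(1, 1, 4),    # x + y <= 4
               (1, 0, 2),    # x      <= 2
               (-1, 0, 0),   # -x     <= 0  (x >= 0)
               (0, -1, 0)]   # -y     <= 0  (y >= 0)

def vertices(cons):
    """Yield feasible intersection points of pairs of constraint boundaries."""
    for (a1, b1, c1), (a2, b2, c2) in combinations(cons, 2):
        det = a1 * b2 - a2 * b1
        if det == 0:
            continue                      # parallel boundaries
        x = (c1 * b2 - c2 * b1) / det
        y = (a1 * c2 - a2 * c1) / det
        # keep only points satisfying every constraint
        if all(a * x + b * y <= c + 1e-9 for a, b, c in cons):
            yield x, y

best = max(vertices(constraints),
           key=lambda p: objective[0] * p[0] + objective[1] * p[1])
best_value = objective[0] * best[0] + objective[1] * best[1]
```

Vertex enumeration grows combinatorially with the number of constraints; practical problems use the simplex method or an interior-point solver instead.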
Markov chains are a mathematical model used to analyze the behavior of a system over time. They have a wide range of applications in various functional areas of management, such as finance, marketing, operations, and human resources. Here are some examples:
Finance: Markov chains are used to analyze the behavior of financial assets, such as stocks, bonds, and currencies. They can help investors and financial analysts to estimate the probability of future prices or returns based on historical data.
Marketing: Markov chains are used to analyze customer behavior and loyalty. They can help companies to estimate the probability of a customer switching from one product or brand to another, and to develop marketing strategies to retain or attract customers.
Operations: Markov chains are used to analyze the behavior of production processes and supply chains. They can help companies to estimate the probability of a machine breakdown, a production delay, or a supply chain disruption, and to develop contingency plans to mitigate these risks.
Human resources: Markov chains are used to analyze employee behavior and career paths. They can help companies to estimate the probability of an employee leaving the company, getting promoted, or changing departments, and to develop retention and talent management strategies.
Estimation of transition probabilities
The transition probabilities in a Markov chain represent the probability of moving from one state to another in a given time period. They are essential for analyzing the behavior of the system and predicting future outcomes.
There are several methods for estimating transition probabilities, depending on the data availability and the complexity of the system. Here are some common methods:
Historical data: If the system has a long and reliable history, we can estimate the transition probabilities based on past observations. This method is simple and straightforward but assumes that the future behavior of the system is similar to the past behavior.
Expert judgment: If the system is complex or has limited data availability, we can rely on expert judgment to estimate the transition probabilities. This method involves asking experts or stakeholders to provide their opinions or estimates based on their knowledge and experience.
Surveys and experiments: If the system involves human behavior or preferences, we can conduct surveys or experiments to collect data and estimate the transition probabilities. This method involves designing surveys or experiments that simulate the behavior of the system and analyzing the results.
Simulation and modeling: If the system is too complex or dynamic, we can use simulation and modeling techniques to estimate the transition probabilities. This method involves developing a mathematical or computational model that simulates the behavior of the system and estimating the transition probabilities based on the model outputs.
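The historical-data approach above amounts to counting observed transitions and normalizing each row. A minimal sketch, using a made-up brand-choice sequence:

```python
from collections import defaultdict

def estimate_transitions(history):
    """Estimate transition probabilities from an observed state
    sequence by counting transitions and normalizing each row."""
    counts = defaultdict(lambda: defaultdict(int))
    for current, nxt in zip(history, history[1:]):
        counts[current][nxt] += 1
    return {s: {t: n / sum(row.values()) for t, n in row.items()}
            for s, row in counts.items()}

# Hypothetical weekly brand-choice observations for one customer
history = ["A", "A", "B", "A", "B", "B", "A", "A"]
P = estimate_transitions(history)
```

Here `P["A"]["B"]` is the estimated probability of switching from brand A to brand B in one period; with longer histories the estimates become more reliable.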
Monte Carlo simulation is a computational technique used to model the behavior of complex systems and estimate the probability of different outcomes. It is widely used in various fields, such as finance, engineering, and science, to analyze risk, optimize decision-making, and evaluate hypotheses.
Scope:
Financial modeling: Monte Carlo simulation can be used to model stock prices, option prices, and portfolio returns, and estimate the probability of different investment outcomes.
Engineering design: Monte Carlo simulation can be used to model the behavior of complex systems, such as aerospace vehicles, nuclear reactors, and chemical plants, and evaluate the impact of different design parameters and operating conditions.
Climate modeling: Monte Carlo simulation can be used to model the behavior of the earth's climate system and estimate the probability of different climate scenarios, such as temperature changes and sea level rise.
Drug development: Monte Carlo simulation can be used to model the behavior of drug molecules and estimate the probability of different drug efficacy and safety outcomes.
Limitations:
Assumptions and simplifications: Monte Carlo simulation models are based on assumptions and simplifications that may not capture the full complexity of the real-world system. This can lead to inaccuracies and biases in the model outputs.
Computation time and resources: Monte Carlo simulation involves running a large number of simulations, which can require significant computation time and resources, especially for complex systems with many variables and parameters.
Data availability and quality: Monte Carlo simulation models rely on input data, which may not be available or may be of poor quality, leading to inaccuracies in the model outputs.
Interpretation and communication: Monte Carlo simulation models can produce a large amount of output data, which can be difficult to interpret and communicate to stakeholders, especially those who are not familiar with the technique.
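The core mechanic of Monte Carlo simulation, repeated random sampling to estimate a quantity, can be shown with the classic estimate of pi: the fraction of random points in the unit square that fall inside the quarter circle approaches pi/4.

```python
import random

def estimate_pi(n_samples, seed=42):
    """Monte Carlo estimate of pi via random points in the unit square."""
    rng = random.Random(seed)                 # seeded for reproducibility
    inside = sum(1 for _ in range(n_samples)
                 if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4 * inside / n_samples

estimate = estimate_pi(100_000)
```

The estimate's error shrinks roughly as 1/sqrt(n), which illustrates the computation-time limitation noted above: each extra digit of accuracy costs about a hundred times more samples.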
Chapter 3: Probability
Probability is a branch of mathematics that deals with the study of random events and their likelihood of occurrence. It provides a way to quantify uncertainty and make informed decisions based on available information.
Concept:
Probability is usually expressed as a number between 0 and 1, where 0 represents an impossible event and 1 represents a certain event. When all outcomes are equally likely, the probability of an event is calculated by dividing the number of favorable outcomes by the total number of possible outcomes.
For example, if we toss a fair coin, the probability of getting a head is 1/2, because there is only one favorable outcome (getting a head) out of two possible outcomes (getting a head or a tail).
Conditional Probability is the probability of an event given that another event has occurred. It is calculated by dividing the probability of the intersection of the two events by the probability of the given (conditioning) event.
For example, suppose we have two bags of marbles. Bag A contains 3 red and 2 blue marbles, while Bag B contains 2 red and 4 blue marbles. We randomly select one of the bags and then randomly select a marble from that bag. If we know that the selected marble is red, what is the probability that it came from Bag A?
The conditional probability of selecting Bag A given that a red marble was chosen can be calculated using Bayes' theorem:
P(A|red) = P(red|A) * P(A) / P(red)
where P(A|red) is the probability of selecting Bag A given that a red marble was chosen, P(red|A) is the probability of selecting a red marble from Bag A, P(A) is the probability of selecting Bag A, and P(red) is the overall probability of selecting a red marble.
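Carrying the marble example through numerically:

```python
# Completing the two-bag marble example with Bayes' theorem.
p_A = 0.5                     # each bag is equally likely to be picked
p_red_given_A = 3 / 5         # Bag A: 3 red out of 5 marbles
p_red_given_B = 2 / 6         # Bag B: 2 red out of 6 marbles

# Total probability of drawing a red marble (law of total probability)
p_red = p_red_given_A * p_A + p_red_given_B * (1 - p_A)

# Bayes' theorem: P(A | red)
p_A_given_red = p_red_given_A * p_A / p_red
```

So P(A | red) = 0.3 / (7/15) = 9/14, roughly 0.64: observing a red marble makes Bag A the more likely source.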
Decision Making:
Probability theory can be used to make informed decisions by calculating the expected value of different outcomes and choosing the option with the highest expected value.
For example, suppose a company is considering two different marketing strategies for a new product. Strategy A has a 60% chance of generating $100,000 in profit and a 40% chance of generating $50,000 in profit, while Strategy B has an 80% chance of generating $80,000 in profit and a 20% chance of generating $30,000 in profit.
The expected value of Strategy A can be calculated as:
(0.6 * $100,000) + (0.4 * $50,000) = $80,000
The expected value of Strategy B can be calculated as:
(0.8 * $80,000) + (0.2 * $30,000) = $70,000
Based on this analysis, the company should choose Strategy A, since it has a higher expected value.
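As a quick check of the arithmetic, the same comparison in code:

```python
def expected_value(outcomes):
    """Expected value of a set of (probability, payoff) pairs."""
    return sum(p * payoff for p, payoff in outcomes)

ev_a = expected_value([(0.6, 100_000), (0.4, 50_000)])
ev_b = expected_value([(0.8, 80_000), (0.2, 30_000)])
best = "A" if ev_a > ev_b else "B"
```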
Probability applications in business
Risk assessment and management: Probability theory is used to quantify and manage risks in business. For example, insurance companies use probability models to estimate the likelihood of different events, such as accidents or natural disasters, and set premiums accordingly. Similarly, businesses use probability models to evaluate the risks associated with different investment options, such as stocks or bonds.
Quality control: Probability theory is used to monitor and improve product and service quality. For example, Six Sigma is a quality control methodology that uses statistical tools, including probability distributions, to identify and eliminate defects in manufacturing processes.
Marketing and sales: Probability theory is used to predict consumer behavior and evaluate marketing and sales strategies. For example, businesses use probability models to estimate the likelihood of a customer purchasing a product, given certain marketing stimuli such as advertising or promotions. This helps businesses to optimize their marketing and sales efforts to achieve maximum return on investment.
Financial analysis: Probability theory is used to analyze financial data and make informed investment decisions. For example, businesses use probability models to estimate the risk and return of different investment options, such as stocks or bonds, and to optimize their investment portfolios accordingly.
Operations management: Probability theory is used to optimize business operations, such as production and inventory management. For example, businesses use probability models to estimate demand for their products and services, and to optimize production and inventory levels to meet that demand while minimizing costs.
Probability distributions are mathematical functions that describe the likelihood of different outcomes of a random variable in a probabilistic model. In statistics, probability distributions are used to model the uncertainty and randomness of data, and they play a fundamental role in statistical inference and decision making.
Normal Distribution: The normal distribution, also known as the Gaussian distribution, is a continuous probability distribution that is symmetric around its mean. It is used to model many natural phenomena, such as heights and weights of people, test scores, and stock prices.
Binomial Distribution: The binomial distribution is a discrete probability distribution that describes the number of successes in a fixed number of independent trials with the same probability of success. It is commonly used to model binary outcomes, such as the number of heads in a coin toss or the number of defective items in a sample.
Poisson Distribution: The Poisson distribution is a discrete probability distribution that describes the probability of a certain number of events occurring in a fixed interval of time or space. It is commonly used to model rare events, such as the number of accidents in a day or the number of customers arriving at a store in an hour.
Exponential Distribution: The exponential distribution is a continuous probability distribution that describes the time between successive events in a Poisson process, where the events occur randomly and independently. It is commonly used to model the lifetime of products, reliability of systems, and waiting times.
Uniform Distribution: The uniform distribution is a continuous probability distribution that describes a random variable that is equally likely to take any value within a specified range. It is commonly used to model random numbers generated from a computer or a dice roll.
Queuing theory is a mathematical approach to the analysis and management of waiting lines or queues. It is widely used in operations research and industrial engineering to optimize the performance of service systems, such as call centers, hospitals, and transportation networks.
Single Server (M/M/1, Infinite, FIFO): The M/M/1, Infinite, FIFO queueing system is a mathematical model used to analyze the performance of a single-server queueing system where customers arrive according to a Poisson process and the service times follow an exponential distribution. The system has one server that serves the customers one at a time, in the order of their arrival (first in, first out).
The M/M/1 notation refers to the characteristics of the arrival and service processes:
M refers to the Poisson arrival process, where arrivals occur independently and at a constant rate λ.
M refers to the exponential service time distribution, where the service times are independent and identically distributed with a mean service rate μ.
1 refers to the number of servers, which in this case is one.
In an infinite queueing system, there is no limit to the number of customers that can wait in the queue. This assumption simplifies the model, as it is often difficult to predict the number of customers that may arrive.
The FIFO (First-In, First-Out) rule is used to determine the order in which customers are served. This means that the customer who arrives first will be served first, and so on.
The performance measures that can be calculated using this model include the average number of customers in the system, the average waiting time in the queue, and the average time a customer spends in the system (including service time and waiting time).
The M/M/1, Infinite, FIFO model is commonly used to analyze the performance of call centers, customer service centers, and other service-oriented businesses where customers arrive randomly and are served one at a time.
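The standard steady-state formulas for the M/M/1 queue (utilization ρ = λ/μ, average number in system L = ρ/(1−ρ), average time in system W = 1/(μ−λ), and so on) can be packaged as below; the arrival and service rates are made up for illustration.

```python
def mm1_measures(lam, mu):
    """Steady-state performance measures for an M/M/1 queue (needs lam < mu)."""
    if lam >= mu:
        raise ValueError("utilization must be below 1 for a steady state")
    rho = lam / mu                 # server utilization
    L = rho / (1 - rho)            # average number of customers in system
    Lq = rho ** 2 / (1 - rho)      # average number waiting in queue
    W = 1 / (mu - lam)             # average time in system
    Wq = rho / (mu - lam)          # average waiting time in queue
    return {"rho": rho, "L": L, "Lq": Lq, "W": W, "Wq": Wq}

# Hypothetical call center: 4 arrivals per hour, service rate 5 per hour
m = mm1_measures(lam=4, mu=5)
```

With these rates the server is busy 80% of the time and a customer spends on average a full hour in the system, most of it waiting, showing how sharply performance degrades as utilization approaches 1.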
The Multi Server (M/M/C, Infinite, FIFO) queueing system is an extension of the Single Server (M/M/1, Infinite, FIFO) queueing system, where there are multiple servers available to serve customers simultaneously. This model is used to analyze the performance of a queueing system where customers arrive according to a Poisson process, service times follow an exponential distribution, and there are C servers available.
The M/M/C notation refers to the characteristics of the arrival and service processes:
M refers to the Poisson arrival process, where arrivals occur independently and at a constant rate λ.
M refers to the exponential service time distribution, where the service times are independent and identically distributed with a mean service rate μ.
C refers to the number of servers available to serve customers simultaneously.
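For the M/M/C system, the probability that an arriving customer has to wait is given by the Erlang C formula. A sketch, with made-up rates and server count:

```python
from math import factorial

def erlang_c(lam, mu, c):
    """Erlang C formula for an M/M/c queue: probability an arrival must
    wait, and the average number of customers waiting in the queue."""
    a = lam / mu                   # offered load in Erlangs
    rho = a / c                    # utilization per server
    if rho >= 1:
        raise ValueError("utilization must be below 1 for a steady state")
    top = (a ** c / factorial(c)) / (1 - rho)
    bottom = sum(a ** k / factorial(k) for k in range(c)) + top
    p_wait = top / bottom          # probability of waiting (Erlang C)
    Lq = p_wait * rho / (1 - rho)  # average queue length
    return p_wait, Lq

# Hypothetical: 2 servers, 3 arrivals/hour, each server serves 2/hour
p_wait, Lq = erlang_c(lam=3, mu=2, c=2)
```

With these numbers roughly 64% of arrivals must wait; staffing models in call centers commonly vary `c` in this formula until the waiting probability falls below a service-level target.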
Chapter 4: CPM and PERT
CPM (Critical Path Method) and PERT (Program Evaluation and Review Technique) are project management techniques used to plan, schedule, and control complex projects.
CPM focuses on identifying the critical path, which is the sequence of activities that determine the project's duration. The critical path represents the longest sequence of tasks in the project and cannot be delayed without affecting the project's completion date. CPM helps project managers identify the activities that are most critical to the project's success and enables them to optimize resources to ensure timely completion.
PERT is a probabilistic technique that allows project managers to estimate the project's duration and analyze the risk associated with the completion date. PERT uses three estimates for each activity: optimistic, pessimistic, and most likely. Based on these estimates, PERT calculates the expected duration for each activity and the overall project duration. PERT also identifies the critical path and helps project managers understand the impact of uncertainty on the project's completion date.
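The PERT three-point estimate weights the most likely duration four times as heavily as the extremes, following the usual beta-distribution approximation; the activity times below are made up for illustration.

```python
def pert_estimate(optimistic, most_likely, pessimistic):
    """PERT three-point estimate: expected duration and variance
    for a single activity, te = (o + 4m + p) / 6."""
    te = (optimistic + 4 * most_likely + pessimistic) / 6
    var = ((pessimistic - optimistic) / 6) ** 2
    return te, var

# Hypothetical activity: 2 days best case, 4 most likely, 8 worst case
te, var = pert_estimate(2, 4, 8)
```

Summing the expected durations (and variances) of the activities on the critical path gives the expected project duration and a measure of its uncertainty.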
Network crashing is a project management technique used to reduce the duration of a project by adding more resources to the critical path activities. This technique involves analyzing the project's critical path and determining the activities that can be shortened by adding additional resources, such as labor, equipment, or overtime.
The crashing process involves the following steps:
1. Identify the critical path activities.
2. Determine the cost and time required to shorten each activity.
3. Select the activities to be crashed.
4. Add resources to the selected activities.
5. Recalculate the project duration and cost.
The concept of project cost involves the estimation, analysis, and control of all the expenses associated with a project. Project cost management is essential for project success, as it helps project managers make informed decisions, monitor project progress, and ensure that the project is completed within the allocated budget.
The components of project cost can be broadly classified into two categories:
Direct Costs: Direct costs are expenses that can be attributed directly to the project and are usually easily measurable. Examples of direct costs include labor, materials, equipment, and subcontractors.
Indirect Costs: Indirect costs are expenses that cannot be attributed directly to the project but are necessary for its completion. Examples of indirect costs include overhead costs, such as rent, utilities, insurance, and administrative expenses.
In the context of network crashing, the additional resources required to shorten the project's critical path will increase the project's direct costs. The increased labor cost, equipment rental cost, and overtime pay are examples of direct costs associated with network crashing. However, network crashing may also reduce the indirect costs associated with the project by shortening the project's duration and thus decreasing the overhead costs associated with the project.
The critical path is the longest sequence of activities in a project that determines the minimum time needed to complete the project. It is an essential concept in project management and is used to schedule and manage project activities.
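Finding the critical path is a longest-path computation over the activity network. A minimal forward-pass sketch, assuming activities are listed in an order where every predecessor appears before its successors; the project data is made up for illustration.

```python
def critical_path(durations, predecessors):
    """Forward pass over an activity-on-node network: earliest start
    times, total project duration, and one critical path."""
    earliest = {}
    for act in durations:                      # assumes topological order
        preds = predecessors.get(act, [])
        earliest[act] = max((earliest[p] + durations[p] for p in preds),
                            default=0)
    # the activity that finishes last fixes the project duration
    end = max(durations, key=lambda a: earliest[a] + durations[a])
    duration = earliest[end] + durations[end]
    # walk backwards along the predecessors that set each earliest start
    path = [end]
    while predecessors.get(path[-1]):
        path.append(max(predecessors[path[-1]],
                        key=lambda p: earliest[p] + durations[p]))
    return list(reversed(path)), duration

# Hypothetical project: C needs A and B to finish; D needs C
durations = {"A": 3, "B": 2, "C": 4, "D": 2}
predecessors = {"C": ["A", "B"], "D": ["C"]}
path, duration = critical_path(durations, predecessors)
```

Here the critical path is A → C → D with a nine-day duration; activity B has slack, so crashing it would not shorten the project.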
Chapter 5: Decision Theory
Decision Theory: Decision theory is a branch of mathematics that seeks to formalize the process of making decisions under uncertainty. It provides a framework for analyzing how people or machines make choices when faced with different options, uncertain outcomes, and varying degrees of risk.
Salvage value refers to the estimated value of an asset at the end of its useful life, or after it has been fully depreciated. It is the estimated amount that an owner would receive if the asset were sold or disposed of at the end of its useful life.
Game theory is a branch of mathematics that studies the behavior of individuals or groups in situations where the outcome of their actions depends on the actions of others. It provides a framework for analyzing strategic interactions among agents who are aware of each other's actions and preferences, and who are trying to maximize their own utility or payoff.
The average dominance method is a decision-making tool in game theory used to reduce a payoff matrix: when no single row or column dominates another outright, the average of two or more rows (or columns) may dominate a third, which can then be deleted. Repeated application shrinks the game and can reveal dominant strategies.
Sequencing problems are a class of optimization problems that involve determining the order or sequence in which a set of tasks or events should be completed in order to achieve some objective.
Laplace Criterion: The Laplace criterion is a decision-making rule that is used to evaluate alternatives when the probabilities of the outcomes are uncertain or unknown. It treats all outcomes as equally likely and selects the alternative with the best average payoff.
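The Laplace criterion reduces to averaging each row of a payoff table and picking the best average. A sketch, with a made-up payoff table:

```python
def laplace_criterion(payoffs):
    """Laplace criterion: treat every state of nature as equally likely
    and pick the alternative with the highest average payoff."""
    averages = {alt: sum(row) / len(row) for alt, row in payoffs.items()}
    best = max(averages, key=averages.get)
    return best, averages

# Hypothetical payoff table: rows are alternatives, columns are
# equally weighted states of nature (e.g. strong/flat/weak demand)
payoffs = {"Expand":   [100, 60, -20],
           "Maintain": [70, 50, 10],
           "Contract": [40, 30, 20]}
best, averages = laplace_criterion(payoffs)
```

With this table, "Expand" has the highest average payoff and would be chosen, even though it carries the only possible loss, which is why the Laplace criterion is often compared against pessimistic rules such as maximin.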