HCMUS - Artificial Intelligence 2013

Undergraduate course, University of Science, Faculty of Mathematics and Computer Science, 2013

The weekly homework topics for our AI2013 class.

Week 1

  • Use MATLAB for exercises 2.7, 2.8, 3.9, and 3.10 in the textbook. Illustrative sketches for each exercise follow this list.
  • 2.7 Implement a performance-measuring environment simulator for the vacuum-cleaner world depicted in Figure 2.2 and specified on page 36. Your implementation should be modular, so that the sensors, actuators, and environment characteristics (size, shape, dirt placement, etc.) can be changed easily.
  • 2.8 Implement a simple reflex agent for the vacuum environment in Exercise 2.7. Run the environment simulator with this agent for all possible initial dirt configurations and agent locations. Record the agent’s performance score for each configuration and its overall average score.
  • 3.9 The missionaries and cannibals problem is usually stated as follows. Three missionaries and three cannibals are on one side of a river, along with a boat that can hold one or two people. Find a way to get everyone to the other side, without ever leaving a group of missionaries in one place outnumbered by the cannibals in that place. This problem is famous in AI because it was the subject of the first paper that approached problem formulation from an analytical viewpoint (Amarel, 1968).
  • 3.10 Implement two versions of the successor function for the 8-puzzle. Write versions of iterative deepening depth-first search that use these functions and compare their performance.
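
A minimal sketch for exercises 2.7 and 2.8, assuming the two-square world of Figure 2.2, a 1000-step horizon, and a performance measure of one point per clean square per time step; the function and variable names are illustrative. The environment loop, the agent, and the actuators live in separate functions so each can be changed independently, as exercise 2.7 asks:

```matlab
% vacuum_world.m -- sketch for exercises 2.7/2.8 (two-square world).
% Assumptions (not fixed by the handout): 1000-step horizon and one
% point per clean square per time step as the performance measure.
function vacuum_world
    steps = 1000;
    scores = zeros(1, 8);
    k = 0;
    for loc = 1:2                 % agent starts in square A(1) or B(2)
        for dirtA = 0:1           % initial dirt in square A
            for dirtB = 0:1       % initial dirt in square B
                k = k + 1;
                scores(k) = simulate(loc, [dirtA dirtB], steps);
                fprintf('start=%d dirt=[%d %d] score=%d\n', ...
                        loc, dirtA, dirtB, scores(k));
            end
        end
    end
    fprintf('average score over all configurations: %.1f\n', mean(scores));
end

function score = simulate(loc, dirt, steps)
    % Environment simulator: applies the agent's action, then awards one
    % point per clean square.  Sensors and actuators are isolated in
    % reflex_agent/apply_action so they can be swapped out (ex. 2.7).
    score = 0;
    for t = 1:steps
        action = reflex_agent(loc, dirt(loc));
        [loc, dirt] = apply_action(loc, dirt, action);
        score = score + sum(dirt == 0);
    end
end

function action = reflex_agent(loc, dirty)
    % Simple reflex agent (ex. 2.8): suck if dirty, otherwise move.
    if dirty
        action = 'Suck';
    elseif loc == 1
        action = 'Right';
    else
        action = 'Left';
    end
end

function [loc, dirt] = apply_action(loc, dirt, action)
    switch action
        case 'Suck',  dirt(loc) = 0;
        case 'Right', loc = 2;
        case 'Left',  loc = 1;
    end
end
```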
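
For exercise 3.9, one straightforward formulation tracks the missionaries, cannibals, and boat on the starting bank as a state [m c b] and runs breadth-first search over it; the sketch below assumes that representation:

```matlab
% mc_bfs.m -- breadth-first search sketch for exercise 3.9.
% State [m c b]: missionaries, cannibals, and boat on the starting bank.
function mc_bfs
    start = [3 3 1];  goal = [0 0 0];
    moves = [1 0; 2 0; 0 1; 0 2; 1 1];     % (missionaries, cannibals) in the boat
    frontier = {start};
    parent = containers.Map({key(start)}, {''});
    while ~isempty(frontier)
        s = frontier{1};  frontier(1) = [];
        if isequal(s, goal)
            print_path(parent, key(goal));
            return
        end
        dir = 2*s(3) - 1;                  % +1: leaving start bank, -1: returning
        for i = 1:size(moves, 1)
            t = [s(1:2) - dir*moves(i,:), 1 - s(3)];
            if valid(t) && ~isKey(parent, key(t))
                parent(key(t)) = key(s);
                frontier{end+1} = t;       %#ok<AGROW>
            end
        end
    end
end

function ok = valid(s)
    % No missionaries outnumbered on either bank, counts within range.
    m = s(1); c = s(2);
    ok = all([m c] >= 0) && all([m c] <= 3) && ...
         (m == 0 || m >= c) && ...         % start bank safe
         (3 - m == 0 || 3 - m >= 3 - c);   % far bank safe
end

function k = key(s)
    k = sprintf('%d%d%d', s);
end

function print_path(parent, k)
    % Walk the parent pointers back from the goal and print the solution.
    path = {};
    while ~isempty(k)
        path = [{k}, path];  %#ok<AGROW>
        k = parent(k);
    end
    fprintf('%s\n', strjoin(path, ' -> '));
end
```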
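
For exercise 3.10, here is iterative deepening depth-first search over the copy-and-edit successor function; the second version the exercise asks you to compare would modify the board in place before the recursive call and undo the move afterwards. The starting board is an illustrative instance two moves from the goal:

```matlab
% puzzle_iddfs.m -- iterative deepening DFS sketch for exercise 3.10.
% The 1x9 board is mapped onto a 3x3 grid via ind2sub/sub2ind
% (MATLAB's column-major order); 0 marks the blank.
function puzzle_iddfs
    goal  = [1 2 3 4 5 6 7 8 0];
    start = [1 2 3 4 5 6 0 7 8];   % illustrative: two moves from the goal
    for limit = 0:31               % 31 moves bounds any solvable instance,
        if dls(start, goal, limit) % but without repeated-state checking
            fprintf('solution found at depth limit %d\n', limit);
            return                 % deep instances are very slow
        end
    end
end

function found = dls(s, goal, limit)
    % Depth-limited DFS, as in the plain textbook algorithm.
    if isequal(s, goal)
        found = true;
        return
    end
    if limit == 0
        found = false;
        return
    end
    succ = successors(s);
    for i = 1:numel(succ)
        if dls(succ{i}, goal, limit - 1)
            found = true;
            return
        end
    end
    found = false;
end

function succ = successors(s)
    % Copy-and-edit successor function: each move slides a tile
    % adjacent to the blank into the blank.
    blank = find(s == 0);
    [r, c] = ind2sub([3 3], blank);
    succ = {};
    deltas = [-1 0; 1 0; 0 -1; 0 1];
    for i = 1:4
        nr = r + deltas(i,1);  nc = c + deltas(i,2);
        if nr >= 1 && nr <= 3 && nc >= 1 && nc <= 3
            t = s;                         % copy the parent state
            j = sub2ind([3 3], nr, nc);
            t([blank j]) = t([j blank]);   % slide the neighbouring tile
            succ{end+1} = t;               %#ok<AGROW>
        end
    end
end
```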

Week 2

  • Shortest path in a weighted graph: solve the problem with the A* algorithm. A sketch follows.
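
A minimal A* sketch under some assumptions the assignment leaves open: the graph is an n-by-n weight matrix W with W(i,j) = 0 meaning no edge, and h is a heuristic row vector that never overestimates the remaining cost (and is consistent, since this simple closed-list version never reopens nodes):

```matlab
% astar.m -- minimal A* sketch for the weighted-graph assignment.
% Assumptions: W(i,j) = 0 means "no edge"; h(i) is an admissible,
% consistent estimate of the cost from node i to the goal.
function [path, cost] = astar(W, h, start, goal)
    n = size(W, 1);
    g = inf(1, n);  g(start) = 0;       % best known cost from start
    parent = zeros(1, n);
    open = false(1, n);  open(start) = true;
    closed = false(1, n);
    while any(open)
        f = g + h;  f(~open) = inf;
        [~, u] = min(f);                % expand the open node with least f = g + h
        if u == goal
            break
        end
        open(u) = false;  closed(u) = true;
        for v = find(W(u, :) > 0)
            if ~closed(v) && g(u) + W(u, v) < g(v)
                g(v) = g(u) + W(u, v);  % found a cheaper route to v
                parent(v) = u;
                open(v) = true;
            end
        end
    end
    cost = g(goal);
    path = goal;                        % reconstruct by following parents
    while path(1) ~= start && parent(path(1)) ~= 0
        path = [parent(path(1)), path]; %#ok<AGROW>
    end
end

% Example: W = [0 1 4; 1 0 2; 4 2 0]; [p, c] = astar(W, [2 1 0], 1, 3)
% returns p = [1 2 3] with c = 3.
```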

Week 3

  • 3.15, questions b and c (page 91 in the textbook). The problem is to find the shortest path between two points on a plane containing convex polygonal obstacles. A sketch follows.
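
Part (b)'s observation motivates the construction: an optimal path bends only at polygon vertices, so those vertices plus the start and goal form the whole state space. The sketch below, for part (c), reuses the astar function from Week 2 and assumes the line-of-sight test has already been done: P is an m-by-2 list of coordinates with the start in row 1 and the goal in row m, and vis(i,j) is true when the segment from P(i,:) to P(j,:) misses every obstacle (writing that test is part of the exercise):

```matlab
% polygon_path.m -- sketch for 3.15(b,c): shortest path among convex
% polygonal obstacles via a visibility graph plus A*.
% Assumptions: P (m-by-2) holds start, obstacle vertices, goal; vis is a
% precomputed m-by-m logical visibility matrix.
function [path, cost] = polygon_path(P, vis)
    m = size(P, 1);
    W = zeros(m);
    for i = 1:m
        for j = 1:m
            if vis(i, j)
                W(i, j) = norm(P(i, :) - P(j, :));  % Euclidean edge length
            end
        end
    end
    % Straight-line distance to the goal: admissible for part (c)'s A*.
    h = sqrt(sum((P - P(m, :)).^2, 2))';  % implicit expansion, R2016b+
    [path, cost] = astar(W, h, 1, m);     % astar from the Week 2 sketch
end
```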

Week 5

  • Multivariate Linear Regression: investigate multivariate linear regression using gradient descent and the normal equations, and examine the relationship between the cost function J(θ), the convergence of gradient descent, and the learning rate η. A sketch of both solvers follows.
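
A minimal sketch of both solvers on a tiny synthetic data set; the data, learning rate, and iteration count are illustrative, not part of the assignment. Gradient descent repeatedly steps opposite the gradient of J(θ), while the normal equations solve for θ in closed form; plotting J per iteration is the usual way to check that η is neither too large (divergence) nor too small (slow convergence):

```matlab
% linreg.m -- gradient descent vs. the normal equations, a minimal sketch.
% Assumptions: the synthetic data, eta, and iteration count below are
% illustrative; X has one example per row, y is a column vector.
function linreg
    % Tiny synthetic data set: y is roughly 1 + 2*x1 + 3*x2.
    X = [1 2; 2 1; 3 4; 4 3; 5 5];
    y = 1 + X * [2; 3] + 0.01 * [1; -1; 1; -1; 1];
    m = size(X, 1);
    Xb = [ones(m, 1), X];               % prepend the intercept column

    % Gradient descent on J(theta) = (1/(2m)) * sum((Xb*theta - y).^2).
    eta = 0.05;                         % learning rate: too large diverges,
    iters = 5000;                       % too small converges slowly
    theta = zeros(size(Xb, 2), 1);
    J = zeros(1, iters);
    for it = 1:iters
        grad = Xb' * (Xb * theta - y) / m;   % gradient of J at theta
        theta = theta - eta * grad;
        J(it) = sum((Xb * theta - y).^2) / (2 * m);
    end
    plot(J), xlabel('iteration'), ylabel('J(\theta)')  % convergence check

    % Normal equations: closed form, no learning rate, no iteration.
    theta_ne = (Xb' * Xb) \ (Xb' * y);
    fprintf('gradient descent: %s\n', mat2str(theta', 3));
    fprintf('normal equations: %s\n', mat2str(theta_ne', 3));
end
```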
