
Commit 984791d

Add files via upload
1 parent: 38ee6ff

43 files changed: +6246 −0 lines
week7/machine-learning-ex6.zip (1.05 MB, binary file not shown)

week7/machine-learning-ex6/ex6.pdf (564 KB, binary file not shown)
@@ -0,0 +1,34 @@
function [C, sigma] = dataset3Params(X, y, Xval, yval)
%DATASET3PARAMS returns your choice of C and sigma for Part 3 of the exercise
%where you select the optimal (C, sigma) learning parameters to use for SVM
%with RBF kernel
%   [C, sigma] = DATASET3PARAMS(X, y, Xval, yval) returns your choice of C and
%   sigma. You should complete this function to return the optimal C and
%   sigma based on a cross-validation set.
%

% You need to return the following variables correctly.
C = 1;
sigma = 0.3;

% ====================== YOUR CODE HERE ======================
% Instructions: Fill in this function to return the optimal C and sigma
%               learning parameters found using the cross validation set.
%               You can use svmPredict to predict the labels on the cross
%               validation set. For example,
%                   predictions = svmPredict(model, Xval);
%               will return the predictions on the cross validation set.
%
%  Note: You can compute the prediction error using
%        mean(double(predictions ~= yval))
%

% =========================================================================

end
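The TODO above amounts to a grid search: for each candidate (C, sigma) pair, train on (X, y), score the pair by its prediction error on (Xval, yval), and keep the pair with the lowest error. A minimal sketch of that pattern in Python (the names `grid_search`, `cv_error`, and the candidate list are illustrative assumptions, not part of the exercise code):

```python
import itertools

def grid_search(candidates, cv_error):
    """Return the (C, sigma) pair with the lowest cross-validation error.

    cv_error(C, sigma) stands in for: train an SVM with these parameters,
    predict on the validation set, and return the mean misclassification rate.
    """
    return min(itertools.product(candidates, candidates),
               key=lambda pair: cv_error(*pair))

# Candidate values spaced in multiplicative steps (an assumed, typical grid).
candidates = [0.01, 0.03, 0.1, 0.3, 1, 3, 10, 30]

# Toy error surface for illustration only; its minimum sits at C=1, sigma=0.1.
toy_error = lambda C, sigma: abs(C - 1) + abs(sigma - 0.1)
print(grid_search(candidates, toy_error))  # (1, 0.1)
```

In the actual Octave function, `cv_error` would wrap `svmTrain` on (X, y) and `svmPredict` on Xval, scored with `mean(double(predictions ~= yval))` as the comment block notes.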
@@ -0,0 +1,61 @@
function x = emailFeatures(word_indices)
%EMAILFEATURES takes in a word_indices vector and produces a feature vector
%from the word indices
%   x = EMAILFEATURES(word_indices) takes in a word_indices vector and
%   produces a feature vector from the word indices.

% Total number of words in the dictionary
n = 1899;

% You need to return the following variables correctly.
x = zeros(n, 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Fill in this function to return a feature vector for the
%               given email (word_indices). To help make it easier to
%               process the emails, we have already pre-processed each
%               email and converted each word in the email into an index in
%               a fixed dictionary (of 1899 words). The variable
%               word_indices contains the list of indices of the words
%               which occur in one email.
%
%               Concretely, if an email has the text:
%
%                   The quick brown fox jumped over the lazy dog.
%
%               Then, the word_indices vector for this text might look
%               like:
%
%                   60  100   33   44   10   53   60   58   5
%
%               where we have mapped each word onto a number, for example:
%
%                   the   -- 60
%                   quick -- 100
%                   ...
%
%               (note: the above numbers are just an example and are not the
%               actual mappings).
%
%               Your task is to take one such word_indices vector and construct
%               a binary feature vector that indicates whether a particular
%               word occurs in the email. That is, x(i) = 1 when word i
%               is present in the email. Concretely, if the word 'the' (say,
%               index 60) appears in the email, then x(60) = 1. The feature
%               vector should look like:
%
%               x = [ 0 0 0 0 1 0 0 0 ... 0 0 0 0 1 ... 0 0 0 1 0 ..];
%

% =========================================================================

end
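The binary indicator vector described in the comments above can be sketched in a few lines of Python (`email_features` is an illustrative analogue of the Octave function, not the exercise's own code):

```python
def email_features(word_indices, n=1899):
    """Binary indicator vector: x[i-1] = 1 iff word index i occurs in the email."""
    x = [0] * n
    for i in word_indices:
        x[i - 1] = 1  # the exercise's word indices are 1-based
    return x

# The example indices from the comment block above.
x = email_features([60, 100, 33, 44, 10, 53, 60, 58, 5])
print(sum(x))  # 8 -- 'the' (index 60) appears twice but is only flagged once
```

Note the feature is presence, not count: repeated occurrences of a word still yield a single 1 at that word's index.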
@@ -0,0 +1,10 @@
> Anyone knows how much it costs to host a web portal ?
>
Well, it depends on how many visitors you're expecting.
This can be anywhere from less than 10 bucks a month to a couple of $100.
You should checkout http://www.rackspace.com/ or perhaps Amazon EC2
if youre running something big..

To unsubscribe yourself from this mailing list, send an email to:
groupname-unsubscribe@egroups.com
@@ -0,0 +1,34 @@
Folks,

my first time posting - have a bit of Unix experience, but am new to Linux.


Just got a new PC at home - Dell box with Windows XP. Added a second hard disk
for Linux. Partitioned the disk and have installed Suse 7.2 from CD, which went
fine except it didn't pick up my monitor.

I have a Dell branded E151FPp 15" LCD flat panel monitor and a nVidia GeForce4
Ti4200 video card, both of which are probably too new to feature in Suse's default
set. I downloaded a driver from the nVidia website and installed it using RPM.
Then I ran Sax2 (as was recommended in some postings I found on the net), but
it still doesn't feature my video card in the available list. What next?

Another problem. I have a Dell branded keyboard and if I hit Caps-Lock twice,
the whole machine crashes (in Linux, not Windows) - even the on/off switch is
inactive, leaving me to reach for the power cable instead.

If anyone can help me in any way with these probs., I'd be really grateful -
I've searched the 'net but have run out of ideas.

Or should I be going for a different version of Linux such as RedHat? Opinions
welcome.

Thanks a lot,
Peter

--
Irish Linux Users' Group: ilug@linux.ie
http://www.linux.ie/mailman/listinfo/ilug for (un)subscription information.
List maintainer: listmaster@linux.ie

week7/machine-learning-ex6/ex6/ex6.m (+150 lines)

@@ -0,0 +1,150 @@
%% Machine Learning Online Class
%  Exercise 6 | Support Vector Machines
%
%  Instructions
%  ------------
%
%  This file contains code that helps you get started on the
%  exercise. You will need to complete the following functions:
%
%     gaussianKernel.m
%     dataset3Params.m
%     processEmail.m
%     emailFeatures.m
%
%  For this exercise, you will not need to change any code in this file,
%  or any other files other than those mentioned above.
%

%% Initialization
clear ; close all; clc

%% =============== Part 1: Loading and Visualizing Data ================
%  We start the exercise by first loading and visualizing the dataset.
%  The following code will load the dataset into your environment and plot
%  the data.
%

fprintf('Loading and Visualizing Data ...\n')

% Load from ex6data1:
% You will have X, y in your environment
load('ex6data1.mat');

% Plot training data
plotData(X, y);

fprintf('Program paused. Press enter to continue.\n');
pause;

%% ==================== Part 2: Training Linear SVM ====================
%  The following code will train a linear SVM on the dataset and plot the
%  decision boundary learned.
%

% Load from ex6data1:
% You will have X, y in your environment
load('ex6data1.mat');

fprintf('\nTraining Linear SVM ...\n')

% You should try to change the C value below and see how the decision
% boundary varies (e.g., try C = 1000)
C = 1;
model = svmTrain(X, y, C, @linearKernel, 1e-3, 20);
visualizeBoundaryLinear(X, y, model);

fprintf('Program paused. Press enter to continue.\n');
pause;

%% =============== Part 3: Implementing Gaussian Kernel ===============
%  You will now implement the Gaussian kernel to use
%  with the SVM. You should complete the code in gaussianKernel.m
%
fprintf('\nEvaluating the Gaussian Kernel ...\n')

x1 = [1 2 1]; x2 = [0 4 -1]; sigma = 2;
sim = gaussianKernel(x1, x2, sigma);

fprintf(['Gaussian Kernel between x1 = [1; 2; 1], x2 = [0; 4; -1], sigma = %f :' ...
         '\n\t%f\n(for sigma = 2, this value should be about 0.324652)\n'], sigma, sim);

fprintf('Program paused. Press enter to continue.\n');
pause;

%% =============== Part 4: Visualizing Dataset 2 ================
%  The following code will load the next dataset into your environment and
%  plot the data.
%

fprintf('Loading and Visualizing Data ...\n')

% Load from ex6data2:
% You will have X, y in your environment
load('ex6data2.mat');

% Plot training data
plotData(X, y);

fprintf('Program paused. Press enter to continue.\n');
pause;

%% ========== Part 5: Training SVM with RBF Kernel (Dataset 2) ==========
%  After you have implemented the kernel, we can now use it to train the
%  SVM classifier.
%
fprintf('\nTraining SVM with RBF Kernel (this may take 1 to 2 minutes) ...\n');

% Load from ex6data2:
% You will have X, y in your environment
load('ex6data2.mat');

% SVM Parameters
C = 1; sigma = 0.1;

% We set the tolerance and max_passes lower here so that the code will run
% faster. However, in practice, you will want to run the training to
% convergence.
model = svmTrain(X, y, C, @(x1, x2) gaussianKernel(x1, x2, sigma));
visualizeBoundary(X, y, model);

fprintf('Program paused. Press enter to continue.\n');
pause;

%% =============== Part 6: Visualizing Dataset 3 ================
%  The following code will load the next dataset into your environment and
%  plot the data.
%

fprintf('Loading and Visualizing Data ...\n')

% Load from ex6data3:
% You will have X, y in your environment
load('ex6data3.mat');

% Plot training data
plotData(X, y);

fprintf('Program paused. Press enter to continue.\n');
pause;

%% ========== Part 7: Training SVM with RBF Kernel (Dataset 3) ==========

%  This is a different dataset that you can use to experiment with. Try
%  different values of C and sigma here.
%

% Load from ex6data3:
% You will have X, y in your environment
load('ex6data3.mat');

% Try different SVM Parameters here
[C, sigma] = dataset3Params(X, y, Xval, yval);

% Train the SVM
model = svmTrain(X, y, C, @(x1, x2) gaussianKernel(x1, x2, sigma));
visualizeBoundary(X, y, model);

fprintf('Program paused. Press enter to continue.\n');
pause;
