Month: October 2015

Decision Tree Explained

Decision trees are a common data mining technique for predicting a target value from several input attributes. Predicting the output involves testing an input sample against a series of rules. Each terminal (leaf) node of the tree represents one of the outputs a sample can belong to. To figure out the output, we start at the root node of the tree and ask a sequence of questions about the features: the interior nodes are labeled with questions, the edges (branches) between them are labeled with the answers, and following the branches that match the sample's attributes eventually lands you in a particular leaf.

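As a minimal sketch of that traversal (the tree, features and thresholds below are made up for illustration, not taken from the post), consider the following Matlab snippet:

% A toy decision tree stored as nested structs: interior nodes hold a
% question (feature name plus threshold), leaves hold an output label.
leaf_play = struct('label', 'play');
leaf_stay = struct('label', 'stay home');
wind_node = struct('feature', 'wind', 'threshold', 20, ...
                   'left', leaf_play, 'right', leaf_stay);
tree      = struct('feature', 'temperature', 'threshold', 15, ...
                   'left', leaf_stay, 'right', wind_node);

% Walk from the root to a leaf, answering one question per interior node.
sample = struct('temperature', 22, 'wind', 10);
node = tree;
while ~isfield(node, 'label')               % stop once we reach a leaf
    if sample.(node.feature) <= node.threshold
        node = node.left;                   % the answer sends us down one branch
    else
        node = node.right;                  % or down the other
    end
end
disp(node.label)                            % prints 'play' for this sample
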
Continue reading “Decision Tree Explained”


Matrix Multiplication with MapReduce

Big Data has possibly become the most used term in the tech world this decade. Everybody is talking about it, and everybody has their own understanding of it, which has made its definition quite ambiguous. Let us deconstruct the term with a situation. Suppose you are creating a database for movie ratings, where rows indicate user IDs, columns indicate movies, and each cell holds the rating (0-5) given by a user to the corresponding movie. This data is likely to be sparse, since you can't have a situation where every user has rated every movie. In a real-world setting you can imagine just how sparse this database is, and what it costs to store such a huge database/matrix.

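To make that storage cost concrete, here is a small sketch (the user/movie counts and density are placeholders, not figures from the post) comparing dense storage with Matlab's built-in sparse format, which keeps only the non-zero (row, column, value) triples:

n_users  = 100000;      % placeholder number of users
n_movies = 20000;       % placeholder number of movies
density  = 0.001;       % assume each user rated about 0.1% of the movies

% Build a random sparse ratings matrix; only the ~2 million non-zero
% entries are stored, as (row, column, value) triples.
ratings = sprand(n_users, n_movies, density);
ratings = ceil(ratings * 5);    % map the non-zeros to ratings 1-5

whos ratings                    % the sparse form needs only tens of MB

% A dense matrix of the same size would need
% 100000 * 20000 * 8 bytes, i.e. roughly 16 GB, mostly to store zeros.
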
Continue reading “Matrix Multiplication with MapReduce”

Randomly picking equal number of samples for each label in Matlab

Here no_of_samples is the number of samples that will remain in your data set for each label after you execute the following code:

classes = unique(labels);
for i = 1:numel(classes)
      % Indices of all samples belonging to the current class.
      cur_class_ind = find(labels == classes(i));
      % Shuffle those indices so the kept samples are chosen at random.
      ind_to_remove = cur_class_ind(randperm(numel(cur_class_ind)));
      % Keep no_of_samples of them; mark the rest for removal.
      ind_to_remove = ind_to_remove(1:(numel(cur_class_ind) - no_of_samples));
      % Delete the marked rows from both the labels and the data.
      labels(ind_to_remove,:) = [];
      data(ind_to_remove,:) = [];
end

Here ‘data’ is your input dataset of dimension m x n (m = number of samples, which the code above trims down, and n = number of features), and ‘labels’ is a column vector containing the output class of every corresponding input sample.

Loading files iteratively for processing in Matlab

If you are a researcher working in Machine Learning, your work almost certainly involves data processing in Matlab. Feature engineering means extracting features from a large number of files (usually csv), and these files need to be parsed so that they can be loaded iteratively and processed.

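A minimal sketch of the pattern (the folder name and the per-file processing step are placeholders):

% List every csv file in the data folder (the folder name is a placeholder).
files = dir(fullfile('data', '*.csv'));

for k = 1:numel(files)
    % Build the full path and load this file's numeric contents.
    fname = fullfile('data', files(k).name);
    M = csvread(fname);

    % Placeholder processing step: extract one feature vector per file,
    % here simply the per-column means (assumes equal column counts).
    features(k, :) = mean(M, 1);
end
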

Continue reading “Loading files iteratively for processing in Matlab”

Tips and Tricks for Training Neural Networks in Theano

Theano is a popular Python metaprogramming framework used for Deep Learning on top of either a CPU or a GPU. The purpose of this post is to suggest some tips you can incorporate if you are running into trouble while applying Deep Learning to your problem.

  • Constant Validation Error– If you have just started with Theano and are applying a logistic regression model to your problem (MNIST digit recognition is not considered a problem here), you are likely to see a constant validation error during training. If that happens you need to fix your learning rate by determining the optimal one: start with 0.1 and keep reducing it by a factor of 10 after every epoch until you see a fall in validation error, then use that learning rate for training. Tip- Whenever you initiate training, always start with a smaller dataset, say 500-1000 samples, and try to overfit your model: give the same dataset to training, validation and test. You should get 100% test accuracy, and your network should have more nodes than inputs so that it can fit the data. If this is not happening, there is almost certainly a bug in your implementation.
  • Gaussian Initialization– By default, Theano's developers have set the initialization of weights to a random uniform distribution. Change it to a Gaussian (normal) distribution and you are likely to get improved results (see the sketch after this list).

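The Theano code itself is Python, but the initialization idea is language-independent; here is a Matlab sketch of the two schemes (the layer sizes and the 0.01 standard deviation are placeholder choices):

n_in  = 784;    % placeholder input size
n_out = 500;    % placeholder layer size

% Uniform initialization (the default the tip refers to): weights drawn
% uniformly from [-r, r], with r a common scaling choice.
r = sqrt(6 / (n_in + n_out));
W_uniform = rand(n_in, n_out) * 2 * r - r;

% Gaussian (normal) initialization suggested in the tip: zero-mean
% normal draws with a small standard deviation.
W_gauss = 0.01 * randn(n_in, n_out);
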
Continue reading “Tips and Tricks for Training Neural Networks in Theano”

Installing Theano and integrating it with GPU on Ubuntu.

If you have working experience with Theano, you probably haven't forgotten what a pain in the ass it was to set up. So I felt it was well worth blogging about for people aspiring to get into Deep Learning. The installation instructions given on the official website are capable of breaking the morale of any newbie who wants to get started. Follow the instructions mentioned below to set up Theano on Ubuntu.

Continue reading “Installing Theano and integrating it with GPU on Ubuntu.”

Recommender Systems Simplified

Almost every regular internet user has become accustomed to personalized recommendations. Everyone is familiar with recommender systems on ecommerce websites like Amazon and Flipkart, but there are also more sophisticated systems in this space. Netflix suggests videos to watch. TiVo records programs on its own, just in case we're interested. Pandora builds personalized music streams by predicting which songs the user will be interested in hearing. All these enterprises use Recommender Systems to enhance the customer experience whenever a user uses their service.

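As a toy illustration of one common approach behind such predictions, user-based collaborative filtering (the ratings below are invented, and real systems are far more elaborate):

% Toy ratings matrix: rows = users, columns = items, 0 = unrated.
R = [5 4 0 1;
     4 5 1 0;
     1 0 5 4;
     0 1 4 5];

u = 4;  i = 1;    % predict user 4's rating for item 1

% Cosine similarity between user u and every other user.
sims = zeros(size(R, 1), 1);
for v = 1:size(R, 1)
    sims(v) = dot(R(u,:), R(v,:)) / (norm(R(u,:)) * norm(R(v,:)));
end
sims(u) = 0;      % ignore the user's own (perfect) self-similarity

% Predict as the similarity-weighted average of the ratings given to
% item i by users who actually rated it.
rated = R(:, i) > 0;
pred  = sum(sims(rated) .* R(rated, i)) / sum(sims(rated));
fprintf('predicted rating: %.2f\n', pred);
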
Continue reading “Recommender Systems Simplified”

How Eventual Consistency works?

Dynamo, the Amazon system behind DynamoDB, pioneered the idea of Eventual Consistency as a way to achieve higher availability and scalability.

A big dataset is broken into chunks, and these chunks are then sent to different machines. Replicas of each chunk are also sent to those machines to provide fault tolerance. So the two requirements we need to deal with here are:

Continue reading “How Eventual Consistency works?”