K-nearest neighbor (KNN) is one of the simplest classification algorithms in machine learning. To classify a new data point, it looks at the k training points closest to it (its "nearest neighbors", measured by distance) and returns the class that appears most often among them, a simple majority vote.
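To make the idea concrete before we touch scikit-learn, here is a tiny hand-rolled sketch of a nearest-neighbor vote in plain Python (one feature per point, no libraries; the function name and toy data are just for illustration):

```python
from collections import Counter

def knn_predict(train_points, train_labels, query, k=3):
    # Sort training points by distance to the query point,
    # then take a majority vote among the k closest labels.
    order = sorted(
        range(len(train_points)),
        key=lambda i: abs(train_points[i] - query),
    )
    nearest_labels = [train_labels[i] for i in order[:k]]
    return Counter(nearest_labels).most_common(1)[0][0]

# Toy data: small values belong to class "A", large values to class "B"
points = [1.0, 1.2, 1.4, 5.0, 5.2, 5.4]
labels = ["A", "A", "A", "B", "B", "B"]
print(knn_predict(points, labels, 1.3))  # the 3 closest points are all "A"
print(knn_predict(points, labels, 5.1))  # the 3 closest points are all "B"
```

This is exactly what scikit-learn's classifier does for us, just with four features instead of one and a much faster neighbor search.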
It may sound tough, but KNN really is one of the easiest classifiers to build yourself in Python. You will need the "scikit-learn" module, which you can install with:
pip install scikit-learn
(Note: the package is imported as sklearn in code, but the package you install is scikit-learn; the old pip install sklearn shortcut is deprecated and should not be used.)
Once scikit-learn is installed, follow the steps below. We are going to use the "iris" dataset: Iris is a flower, and the dataset contains measurements of many iris samples along with their species, so a model trained on it can recognize which type of iris a given flower is. You can read about the whole dataset here: https://archive.ics.uci.edu/dataset/53/iris
Now let's look at how many parameters and how many classes the dataset has.
Parameters in The Dataset
- Sepal length in cm
- Sepal width in cm
- Petal length in cm
- Petal width in cm
There are four parameters in the dataset.
Classes in the Dataset
- Iris-Setosa [0]
- Iris-Versicolour [1]
- Iris-Virginica [2]
Classes are just the classification values: if the classifier returns 0, the given flower is Iris-Setosa; if 1, Iris-Versicolour; and if 2, Iris-Virginica.
K-nearest Neighbor Classifier Code Starts here
from sklearn import datasets
from sklearn.neighbors import KNeighborsClassifier
iris = datasets.load_iris()
# print(iris.DESCR)
features = iris.data
labels = iris.target
# print(features[0], labels[0])
# Training Classifier
clf = KNeighborsClassifier()
clf.fit(features, labels)
# Giving Four Parameters
'''
- sepal length in cm
- sepal width in cm
- petal length in cm
- petal width in cm
'''
preds = clf.predict([[1, 1, 1, 1]])
'''
- Iris-Setosa [0]
- Iris-Versicolour [1]
- Iris-Virginica [2]
'''
print(preds)
How this program works (Explanation)
Step 1: Importing Libraries
from sklearn import datasets
from sklearn.neighbors import KNeighborsClassifier
We import datasets from sklearn to load the Iris dataset (a famous dataset with flower measurements), and we import KNeighborsClassifier to use the K-Nearest Neighbors (KNN) algorithm for classification.
Step 2: Loading the Iris Dataset
iris = datasets.load_iris()
This loads the Iris dataset, which contains measurements of three different types of iris flowers.
Step 3: Extracting Features & Labels
features = iris.data
labels = iris.target
features contains the flower measurements (sepal length, sepal width, petal length, petal width), and labels contains the flower type (0 = Setosa, 1 = Versicolour, 2 = Virginica).
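If you want to see the names behind those numeric labels instead of remembering the mapping, the loaded dataset also carries a target_names array (a small sketch, separate from the main program):

```python
from sklearn import datasets

iris = datasets.load_iris()

# target_names maps the numeric labels 0, 1, 2 back to flower names
print(iris.target_names)     # the three species names, in label order
print(iris.target_names[0])  # the name behind label 0
```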
Step 4: Training the KNN Classifier
clf = KNeighborsClassifier()
clf.fit(features, labels)
We create a KNeighborsClassifier() model, and fit(features, labels) trains it on the Iris dataset.
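By default, KNeighborsClassifier votes among 5 neighbors; you can change that with the n_neighbors parameter. A minimal sketch (the choice of 3 here is just an example):

```python
from sklearn import datasets
from sklearn.neighbors import KNeighborsClassifier

iris = datasets.load_iris()

# n_neighbors controls how many nearest points get a vote (default is 5)
clf = KNeighborsClassifier(n_neighbors=3)
clf.fit(iris.data, iris.target)

# The first sample in the dataset is a Setosa (label 0),
# so predicting on its own measurements should return 0.
print(clf.predict([iris.data[0]]))
```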
Step 5: Making a Prediction
preds = clf.predict([[1, 1, 1, 1]])
We give four values (sepal length, sepal width, petal length, petal width).
The model predicts which type of iris flower matches these measurements.
Step 6: Understanding the Output
print(preds)
The output will be 0, 1, or 2:
- 0 → Iris-Setosa
- 1 → Iris-Versicolour
- 2 → Iris-Virginica
This is a basic Machine Learning model that classifies flowers based on measurements using the K-Nearest Neighbors (KNN) algorithm. You can change the input values to see how it predicts different flowers!
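If you want to check how well the trained model actually performs, a common next step (a sketch going beyond the original program) is to hold out part of the data for testing and measure accuracy on it:

```python
from sklearn import datasets
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

iris = datasets.load_iris()

# Keep 30% of the samples aside for testing; random_state fixes the split
X_train, X_test, y_train, y_test = train_test_split(
    iris.data, iris.target, test_size=0.3, random_state=42
)

clf = KNeighborsClassifier()
clf.fit(X_train, y_train)

# score() reports the fraction of test samples classified correctly
print(clf.score(X_test, y_test))
```

On the Iris dataset KNN typically scores well above 90% accuracy, which is why it is such a popular first example.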

This is all about the K-nearest neighbor classifier using Python. I hope you liked it; if so, share it with your friends and colleagues, and leave a comment to support me. Thank you for reading the article.