• Name: Avinash Rai
• SID: 89923
• Subject: Deep Learning
LAB INDEX

S. No.  Name of Experiment                Page No.   Faculty Sign.
1       House Price Prediction            1 to 3
2       Graduation Admission Prediction   4 to 6
3       Car Price Prediction              7 to 11
4       Laptop Price Prediction           12 to 15
5       Mobile Price Prediction           16 to 18
6       Employee Salary Analysis          19 to 24
7       Advertisement Analysis            25 to 33
8       Credit Card Fraud Detection       34 to 36
9       Heart Failure Analysis            37 to 41
10      Apple Quality Detection           42 to 45
LAB 1
Data Loading
# IMPORTING LIBRARIES
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
import tensorflow as tf

# IMPORTING DATASET
pd.set_option('display.max_columns', None)
ds = pd.read_csv(r"D:\\projects\\DL labs\\house_data (Lab1).csv")
ds.head()
id date price bedrooms bathrooms sqft_living sqft_lot floors waterfront view condition grade sqft_above s
0 7129300520 20141013T000000 221900.0 3 1.00 1180 5650 1.0 0 0 3 7 1180
1 6414100192 20141209T000000 538000.0 3 2.25 2570 7242 2.0 0 0 3 7 2170
2 5631500400 20150225T000000 180000.0 2 1.00 770 10000 1.0 0 0 3 6 770
3 2487200875 20141209T000000 604000.0 4 3.00 1960 5000 1.0 0 0 5 7 1050
4 1954400510 20150218T000000 510000.0 3 2.00 1680 8080 1.0 0 0 3 8 1680
# DATASET HAS 21613 ROWS AND 21 FEATURES
ds.shape
(21613, 21)
# REMOVING UNNECESSARY FEATURES
ds.drop(['id', 'date'], axis=1, inplace=True)
ds.head()
price bedrooms bathrooms sqft_living sqft_lot floors waterfront view condition grade sqft_above sqft_basement yr_built yr_reno
0 221900.0 3 1.00 1180 5650 1.0 0 0 3 7 1180 0 1955
1 538000.0 3 2.25 2570 7242 2.0 0 0 3 7 2170 400 1951
2 180000.0 2 1.00 770 10000 1.0 0 0 3 6 770 0 1933
3 604000.0 4 3.00 1960 5000 1.0 0 0 5 7 1050 910 1965
4 510000.0 3 2.00 1680 8080 1.0 0 0 3 8 1680 0 1987
# CHECKING FOR NULL VALUES
ds.isnull().sum()
price            0
bedrooms         0
bathrooms        0
sqft_living      0
sqft_lot         0
floors           0
waterfront       0
view             0
condition        0
grade            0
sqft_above       0
sqft_basement    0
yr_built         0
yr_renovated     0
zipcode          0
lat              0
long             0
sqft_living15    0
sqft_lot15       0
dtype: int64

NO MISSING DATA
ds.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 21613 entries, 0 to 21612
Data columns (total 19 columns):
 #   Column         Non-Null Count  Dtype
---  ------         --------------  -----
 0   price          21613 non-null  float64
 1   bedrooms       21613 non-null  int64
 2   bathrooms      21613 non-null  float64
 3   sqft_living    21613 non-null  int64
 4   sqft_lot       21613 non-null  int64
 5   floors         21613 non-null  float64
 6   waterfront     21613 non-null  int64
 7   view           21613 non-null  int64
 8   condition      21613 non-null  int64
 9   grade          21613 non-null  int64
 10  sqft_above     21613 non-null  int64
 11  sqft_basement  21613 non-null  int64
 12  yr_built       21613 non-null  int64
 13  yr_renovated   21613 non-null  int64
 14  zipcode        21613 non-null  int64
 15  lat            21613 non-null  float64
 16  long           21613 non-null  float64
 17  sqft_living15  21613 non-null  int64
 18  sqft_lot15     21613 non-null  int64
dtypes: float64(5), int64(14)
memory usage: 3.1 MB
ds.columns

Index(['price', 'bedrooms', 'bathrooms', 'sqft_living', 'sqft_lot', 'floors',
       'waterfront', 'view', 'condition', 'grade', 'sqft_above',
'sqft_basement', 'yr_built', 'yr_renovated', 'zipcode', 'lat', 'long', 'sqft_living15', 'sqft_lot15'],
dtype='object')
Exploratory Data Analysis (EDA)
corr = ds[['bathrooms', 'bedrooms', 'sqft_living', 'sqft_lot', 'floors', 'grade', 'price']]

plt.figure(figsize=(10, 8))
plt.title('Correlation Matrix')
sns.heatmap(corr.corr(), annot=True)
<Axes: title={'center': 'Correlation Matrix'}>
# DIVIDING THE DATASET INTO DEPENDENT AND INDEPENDENT VARIABLES
x = ds.iloc[:, 1:].values
y = ds.iloc[:, 0].values
# SPLITTING THE DATASET INTO TRAINING AND TESTING SETS
from sklearn.model_selection import train_test_split
x_train,x_test,y_train,y_test = train_test_split(x,y,test_size = 0.2,random_state = 0)
Model Building
Artificial Neural Network (ANN)
# FEATURE SCALING IS NECESSARY IN AN ANN
from sklearn.preprocessing import StandardScaler
sc = StandardScaler()
x_train = sc.fit_transform(x_train)
x_test = sc.transform(x_test)
# CREATING THE ANN AS A SEQUENCE OF LAYERS
ann = tf.keras.models.Sequential()
# ADDING FIRST HIDDEN LAYER WITH 30 NEURONS; THE INPUT LAYER IS ADDED AUTOMATICALLY
ann.add(tf.keras.layers.Dense(units = 30, activation = 'relu'))
#ADDING 2ND HIDDEN LAYER WITH 30 NEURONS
ann.add(tf.keras.layers.Dense(units = 30, activation = 'relu'))
#ADDING 3RD HIDDEN LAYER WITH 30 NEURONS
ann.add(tf.keras.layers.Dense(units = 30, activation = 'relu'))
#ADDING 4TH HIDDEN LAYER WITH 30 NEURONS
ann.add(tf.keras.layers.Dense(units = 30, activation = 'relu'))
# ADDING OUTPUT LAYER WITH 1 NEURON (LINEAR OUTPUT, SINCE THIS IS REGRESSION)
ann.add(tf.keras.layers.Dense(units = 1))
# COMPILING THE ANN WITH THE ADAM OPTIMIZER (AN ADAPTIVE VARIANT OF STOCHASTIC GRADIENT DESCENT)
ann.compile(optimizer = 'adam', loss = 'mean_squared_error')
# TRAINING THE ANN WITH A BATCH SIZE OF 20 (MINI-BATCH LEARNING)
ann.fit(x_train, y_train, batch_size = 20, epochs = 10)
Epoch 1/10
865/865 [==============================] - 6s 4ms/step - loss: 220945924096.0000 Epoch 2/10
865/865 [==============================] - 4s 4ms/step - loss: 52462403584.0000 Epoch 3/10
865/865 [==============================] - 4s 4ms/step - loss: 41954840576.0000 Epoch 4/10
865/865 [==============================] - 4s 4ms/step - loss: 37630251008.0000 Epoch 5/10
865/865 [==============================] - 4s 4ms/step - loss: 35848007680.0000 Epoch 6/10
865/865 [==============================] - 2s 2ms/step - loss: 34793172992.0000 Epoch 7/10
865/865 [==============================] - 3s 4ms/step - loss: 34105667584.0000 Epoch 8/10
865/865 [==============================] - 4s 4ms/step - loss: 33591887872.0000 Epoch 9/10
865/865 [==============================] - 4s 4ms/step - loss: 33057228800.0000 Epoch 10/10
865/865 [==============================] - 4s 4ms/step - loss: 32762683392.0000
# COMPARING ACTUAL VALUES WITH PREDICTED VALUES
np.set_printoptions(precision=2)
y_pred = ann.predict(x_test)
np.concatenate((y_pred,y_test.reshape(-1,1)),1)
Accuracy and evaluation
# EVALUATING THE MODEL WITH THE R² SCORE
from sklearn.metrics import r2_score
y_pred = ann.predict(x_test)
print(r2_score(y_test, y_pred))
The model achieves an R² score of around 0.79 on the test set; tuning the hyperparameters may improve it further.
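One way to explore that suggestion is a small manual sweep over layer width and training length. The sketch below is illustrative only; the candidate values and the two-hidden-layer architecture are assumptions, not part of the original lab.

# Minimal hyperparameter sweep (illustrative values, not from the original lab)
from sklearn.metrics import r2_score

best_r2, best_cfg = -np.inf, None
for units in [30, 60, 90]:            # candidate hidden-layer widths
    for epochs in [10, 25, 50]:       # candidate training lengths
        candidate = tf.keras.models.Sequential([
            tf.keras.layers.Dense(units, activation='relu'),
            tf.keras.layers.Dense(units, activation='relu'),
            tf.keras.layers.Dense(1),
        ])
        candidate.compile(optimizer='adam', loss='mean_squared_error')
        candidate.fit(x_train, y_train, batch_size=20, epochs=epochs, verbose=0)
        r2 = r2_score(y_test, candidate.predict(x_test, verbose=0))
        if r2 > best_r2:
            best_r2, best_cfg = r2, (units, epochs)

print(best_cfg, best_r2)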
LAB 2
Data Loading
import numpy as np   # linear algebra
import pandas as pd  # data processing, CSV file I/O (e.g. pd.read_csv)
import matplotlib.pyplot as plt
import seaborn as sns
import warnings
warnings.filterwarnings('ignore')

df = pd.read_csv(r"C:\Users\gunja\OneDrive\Desktop\6th Sem\DL Lab\Admission (Lab2).csv")
df.head(2)
Serial No. GRE Score TOEFL Score University Rating SOP LOR CGPA Research Chance of Admit
0 1 337 118 4 4.5 4.5 9.65 1 0.92
1 2 324 107 4 4.0 4.5 8.87 1 0.76
df.shape

(500, 9)

df.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 500 entries, 0 to 499
Data columns (total 9 columns):
 #   Column             Non-Null Count  Dtype
---  ------             --------------  -----
 0   Serial No.         500 non-null    int64
 1   GRE Score          500 non-null    int64
 2   TOEFL Score        500 non-null    int64
 3   University Rating  500 non-null    int64
 4   SOP                500 non-null    float64
 5   LOR                500 non-null    float64
 6   CGPA               500 non-null    float64
 7   Research           500 non-null    int64
 8   Chance of Admit    500 non-null    float64
dtypes: float64(4), int64(5)
memory usage: 35.3 KB

df.duplicated().sum()

0
df=df.iloc[:,1:]
df
GRE Score TOEFL Score University Rating SOP LOR CGPA Research Chance of Admit
0 337 118 4 4.5 4.5 9.65 1 0.92
1 324 107 4 4.0 4.5 8.87 1 0.76
2 316 104 3 3.0 3.5 8.00 1 0.72
3 322 110 3 3.5 2.5 8.67 1 0.80
4 314 103 2 2.0 3.0 8.21 0 0.65
... ... ... ... ... ... ... ... ...
495 332 108 5 4.5 4.0 9.02 1 0.87
496 337 117 5 5.0 5.0 9.87 1 0.96
497 330 120 5 4.5 5.0 9.56 1 0.93
498 312 103 4 4.0 5.0 8.43 0 0.73
499 327 113 4 4.5 4.5 9.04 0 0.84
500 rows × 8 columns
EDA
corr = df.corr()
mask = np.zeros_like(corr)
mask[np.triu_indices_from(mask)] = True

with sns.axes_style("white"):
    f, ax = plt.subplots(figsize=(9, 7))
    ax = sns.heatmap(corr, mask=mask, square=True, annot=True, fmt='0.2f', linewidths=.8, cmap="hsv")
x=df.iloc[:,0:-1]
y=df.iloc[:,-1]
y

0      0.92
1      0.76
2      0.72
3      0.80
4      0.65
       ...
495    0.87
496    0.96
497    0.93
498    0.73
499    0.84
Name: Chance of Admit , Length: 500, dtype: float64
# WHEN THE UPPER AND LOWER BOUNDS OF THE FEATURES ARE KNOWN, MIN-MAX SCALING IS A GOOD CHOICE
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler

x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.2, random_state=1)

scaler = MinMaxScaler()
x_train_scaled = scaler.fit_transform(x_train)
x_test_scaled = scaler.transform(x_test)
x_train_scaled
array([[0.4 , 0.42857143, 0.5 , ..., 0.57142857, 0.50320513, 0. ],
[0.56 , 0.64285714, 0. , ..., 0.57142857, 0.55769231, 1. ],
[0.2 , 0.32142857, 0.5 , ..., 0.28571429, 0.34615385, 0. ],
...,
[0.7 , 0.53571429, 0.5 , ..., 0.57142857, 0.74038462, 1. ],
[0.72 , 0.67857143, 1. , ..., 0.71428571, 0.77884615, 1. ],
[0.2 , 0.46428571, 0. , ..., 0.14285714, 0.32051282, 0. ]])
Model Building
import tensorflow
from tensorflow import keras
from keras import Sequential
from keras.layers import Dense
model = Sequential()
model.add(Dense(7, activation='relu', input_dim=7))
model.add(Dense(7, activation='relu'))
model.add(Dense(7, activation='relu'))
model.add(Dense(7, activation='relu'))
model.add(Dense(1, activation='linear'))
model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
 dense (Dense)               (None, 7)                 56

 dense_1 (Dense)             (None, 7)                 56

 dense_2 (Dense)             (None, 7)                 56

 dense_3 (Dense)             (None, 7)                 56

 dense_4 (Dense)             (None, 1)                 8
=================================================================
Total params: 232 (928.00 Byte)
Trainable params: 232 (928.00 Byte)
Non-trainable params: 0 (0.00 Byte)
_________________________________________________________________
model.compile(loss='mean_squared_error', optimizer="Adam")
history = model.fit(x_train_scaled, y_train, epochs=10, validation_split=0.2)

Epoch 1/10
10/10 [==============================] - 4s 71ms/step - loss: 0.5200 - val_loss: 0.5547 Epoch 2/10
10/10 [==============================] - 0s 14ms/step - loss: 0.4934 - val_loss: 0.5236 Epoch 3/10
10/10 [==============================] - 0s 14ms/step - loss: 0.4624 - val_loss: 0.4876 Epoch 4/10
10/10 [==============================] - 0s 15ms/step - loss: 0.4263 - val_loss: 0.4476 Epoch 5/10
10/10 [==============================] - 0s 14ms/step - loss: 0.3875 - val_loss: 0.4051 Epoch 6/10
10/10 [==============================] - 0s 14ms/step - loss: 0.3456 - val_loss: 0.3579 Epoch 7/10
10/10 [==============================] - 0s 15ms/step - loss: 0.2997 - val_loss: 0.3067 Epoch 8/10
10/10 [==============================] - 0s 15ms/step - loss: 0.2503 - val_loss: 0.2518 Epoch 9/10
10/10 [==============================] - 0s 15ms/step - loss: 0.1989 - val_loss: 0.1934 Epoch 10/10
10/10 [==============================] - 0s 16ms/step - loss: 0.1459 - val_loss: 0.1339

y_pred = model.predict(x_test_scaled)
4/4 [==============================] - 0s 4ms/step
Accuracy and evaluation
from sklearn.metrics import r2_score
r2_score(y_test, y_pred)
-5.097606032550521
import matplotlib.pyplot as plt
plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])

[<matplotlib.lines.Line2D at 0x179efe3ef50>]
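An R² below zero means the network is doing worse than simply predicting the mean of y_test; the loss curves above are still falling after 10 epochs, so the model has likely not converged yet. A minimal sketch of continuing training with early stopping (the epoch limit and patience are illustrative assumptions):

# Train for longer and stop once the validation loss stops improving
from keras.callbacks import EarlyStopping

early_stop = EarlyStopping(monitor='val_loss', patience=10, restore_best_weights=True)
history = model.fit(x_train_scaled, y_train, epochs=200,
                    validation_split=0.2, callbacks=[early_stop], verbose=0)

y_pred = model.predict(x_test_scaled)
print(r2_score(y_test, y_pred))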
LAB 3
Data Loading
import numpy as np   # linear algebra
import pandas as pd  # data processing, CSV file I/O (e.g. pd.read_csv)
import matplotlib.pyplot as plt
import seaborn as sns
import warnings
warnings.filterwarnings('ignore')

df = pd.read_csv(r"D:\\projects\\DL labs\\CarPrice (Lab3).csv")
df.head(2)
car_ID symboling CarName fueltype aspiration doornumber carbody drivewheel enginelocation wheelbase ... enginesize fuelsystem boreratio stroke compressionratio horsepower peakrpm citympg highwaympg price
0 1 3 alfa-romero
giulia gas std two convertible rwd front 88.6 ... 130 mpfi 3.47 2.68 9.0 111 5000 21 27 13495.0
1 2 3 alfa-romero
stelvio gas std two convertible rwd front 88.6 ... 130 mpfi 3.47 2.68 9.0 111 5000 21 27 16500.0
2 rows × 26 columns
df.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 205 entries, 0 to 204
Data columns (total 26 columns):
 #   Column            Non-Null Count  Dtype
---  ------            --------------  -----
 0   car_ID            205 non-null    int64
 1   symboling         205 non-null    int64
 2   CarName           205 non-null    object
 3   fueltype          205 non-null    object
 4   aspiration        205 non-null    object
 5   doornumber        205 non-null    object
 6   carbody           205 non-null    object
 7   drivewheel        205 non-null    object
 8   enginelocation    205 non-null    object
 9   wheelbase         205 non-null    float64
 10  carlength         205 non-null    float64
 11  carwidth          205 non-null    float64
 12  carheight         205 non-null    float64
 13  curbweight        205 non-null    int64
 14  enginetype        205 non-null    object
 15  cylindernumber    205 non-null    object
 16  enginesize        205 non-null    int64
 17  fuelsystem        205 non-null    object
 18  boreratio         205 non-null    float64
 19  stroke            205 non-null    float64
 20  compressionratio  205 non-null    float64
 21  horsepower        205 non-null    int64
 22  peakrpm           205 non-null    int64
 23  citympg           205 non-null    int64
 24  highwaympg        205 non-null    int64
 25  price             205 non-null    float64
dtypes: float64(8), int64(8), object(10)
memory usage: 41.8+ KB
df.shape

(205, 26)

df.head(5)
car_ID symboling CarName fueltype aspiration doornumber carbody drivewheel enginelocation wheelbase ... enginesize fuelsystem boreratio stroke compressionratio horsepower peakrpm citympg highwaympg price
0 1 3 alfa-romero
giulia gas std two convertible rwd front 88.6 ... 130 mpfi 3.47 2.68 9.0 111 5000 21 27 13495.0
1 2 3 alfa-romero
stelvio gas std two convertible rwd front 88.6 ... 130 mpfi 3.47 2.68 9.0 111 5000 21 27 16500.0
2 3 1 alfa-romero
Quadrifoglio gas std two hatchback rwd front 94.5 ... 152 mpfi 2.68 3.47 9.0 154 5000 19 26 16500.0
3 4 2 audi 100 ls gas std four sedan fwd front 99.8 ... 109 mpfi 3.19 3.40 10.0 102 5500 24 30 13950.0
4 5 2 audi 100ls gas std four sedan 4wd front 99.4 ... 136 mpfi 3.19 3.40 8.0 115 5500 18 22 17450.0
5 rows × 26 columns
nan_count = df.isna().sum()
print(nan_count)
car_ID 0
symboling 0
CarName 0
fueltype 0
aspiration 0
doornumber 0
carbody 0
drivewheel 0
enginelocation 0
wheelbase 0
carlength 0
carwidth 0
carheight 0
curbweight 0
enginetype 0
cylindernumber 0
enginesize 0
fuelsystem 0
boreratio 0
stroke 0
compressionratio 0
horsepower 0
peakrpm 0
citympg 0
highwaympg 0
price               0
dtype: int64
print(list(df))
['car_ID', 'symboling', 'CarName', 'fueltype', 'aspiration', 'doornumber', 'carbody', 'drivewheel', 'enginelocation', 'wheelbase', 'carlength', 'carwidth', 'carheight', 'curbweight', 'enginetype', 'cylindernumber', 'enginesiz e', 'fuelsystem', 'boreratio', 'stroke', 'compressionratio', 'horsepower', 'peakrpm', 'citympg', 'highwaympg', 'price']
df.head(2)
car_ID symboling CarName fueltype aspiration doornumber carbody drivewheel enginelocation wheelbase ... enginesize fuelsystem boreratio stroke compressionratio horsepower peakrpm citympg highwaympg price
0 1 3 alfa-romero
giulia gas std two convertible rwd front 88.6 ... 130 mpfi 3.47 2.68 9.0 111 5000 21 27 13495.0
1 2 3 alfa-romero
stelvio gas std two convertible rwd front 88.6 ... 130 mpfi 3.47 2.68 9.0 111 5000 21 27 16500.0
2 rows × 26 columns

df.describe()
            car_ID   symboling   wheelbase   carlength    carwidth   carheight   curbweight  enginesize   boreratio      stroke  compressionratio  horsepower      peakrpm     citympg  highwaympg         price
count   205.000000  205.000000  205.000000  205.000000  205.000000  205.000000   205.000000  205.000000  205.000000  205.000000        205.000000  205.000000   205.000000  205.000000  205.000000    205.000000
mean    103.000000    0.834146   98.756585  174.049268   65.907805   53.724878  2555.565854  126.907317    3.329756    3.255415         10.142537  104.117073  5125.121951   25.219512   30.751220  13276.710571
std      59.322565    1.245307    6.021776   12.337289    2.145204    2.443522   520.680204   41.642693    0.270844    0.313597          3.972040   39.544167   476.985643    6.542142    6.886443   7988.852332
min       1.000000   -2.000000   86.600000  141.100000   60.300000   47.800000  1488.000000   61.000000    2.540000    2.070000          7.000000   48.000000  4150.000000   13.000000   16.000000   5118.000000
25%      52.000000    0.000000   94.500000  166.300000   64.100000   52.000000  2145.000000   97.000000    3.150000    3.110000          8.600000   70.000000  4800.000000   19.000000   25.000000   7788.000000
50%     103.000000    1.000000   97.000000  173.200000   65.500000   54.100000  2414.000000  120.000000    3.310000    3.290000          9.000000   95.000000  5200.000000   24.000000   30.000000  10295.000000
75%     154.000000    2.000000  102.400000  183.100000   66.900000   55.500000  2935.000000  141.000000    3.580000    3.410000          9.400000  116.000000  5500.000000   30.000000   34.000000  16503.000000
max     205.000000    3.000000  120.900000  208.100000   72.300000   59.800000  4066.000000  326.000000    3.940000    4.170000         23.000000  288.000000  6600.000000   49.000000   54.000000  45400.000000
df=df.drop(['car_ID'],axis=1)
EDA
# Visualize the correlation matrix using a Seaborn heatmap
import matplotlib.pyplot as plt
import seaborn as sns

Corr_mat = df.select_dtypes(include=np.number).corr()   # correlation of the numeric columns

plt.figure(figsize=(10, 8))
mask = np.triu(np.ones_like(Corr_mat))
sns.heatmap(Corr_mat, annot=True, cmap='coolwarm', mask=mask)
plt.title('Correlation Matrix')
plt.show()
Features most strongly related to price: carlength, carwidth, curbweight, highwaympg, carheight.
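The same shortlist can also be read off programmatically by ranking the numeric features by their absolute correlation with price, using the Corr_mat computed above:

# Rank numeric features by absolute correlation with price
price_corr = Corr_mat['price'].drop('price')
print(price_corr.abs().sort_values(ascending=False))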
# Boxplots of the categorical variables against price
plt.figure(figsize=(50, 45))
plt.subplot(3, 3, 1)
sns.boxplot(x = 'doornumber', y = 'price', data = df)
# plt.subplot(3,3,2)
# sns.boxplot(x = 'fueltype', y = 'price', data = df)
# plt.subplot(3,3,3)
# sns.boxplot(x = 'aspiration', y = 'price', data = df)
plt.show()
numerical_columns = df.select_dtypes(include=np.number).columns
numerical_columns_without_price_carID = [col for col in numerical_columns if col not in ['price', 'car_ID']]
X = df[['carlength','carwidth','curbweight','carheight','highwaympg']]
y=df['price']
X
carlength carwidth curbweight carheight highwaympg
0 168.8 64.1 2548 48.8 27
1 168.8 64.1 2548 48.8 27
2 171.2 65.5 2823 52.4 26
3 176.6 66.2 2337 54.3 30
4 176.6 66.4 2824 54.3 22
... ... ... ... ... ...
200 188.8 68.9 2952 55.5 28
201 188.8 68.8 3049 55.5 25
202 188.8 68.9 3012 55.5 23
203 188.8 68.9 3217 55.5 27
204 188.8 68.9 3062 55.5 25
205 rows × 5 columns

Using Keras

import tensorflow
from tensorflow import keras
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.layers import Dropout
from keras import regularizers
from keras.layers import BatchNormalization
Model Building
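The model below trains on X_train and y_train, but the train/test split and scaling step are not shown in this excerpt; the loss values in the training log (roughly 0.1 to 1.1) suggest that both the features and the target were scaled. A minimal sketch of what that missing step might look like, assuming a StandardScaler for both X and y:

# Assumed preprocessing: split the data, then standardize features and target
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

x_scaler, y_scaler = StandardScaler(), StandardScaler()
X_train = x_scaler.fit_transform(X_train)
X_test = x_scaler.transform(X_test)
y_train = y_scaler.fit_transform(y_train.values.reshape(-1, 1))
y_test = y_scaler.transform(y_test.values.reshape(-1, 1))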
model = Sequential()
model.add(Dense(10, activation='relu', input_dim=X_train.shape[1]))
model.add(Dense(10, activation='relu'))
model.add(Dense(1, activation='linear'))

model.summary()

Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
 dense (Dense)               (None, 10)                60

 dense_1 (Dense)             (None, 10)                110

 dense_2 (Dense)             (None, 1)                 11
=================================================================
Total params: 181 (724.00 Byte)
Trainable params: 181 (724.00 Byte)
Non-trainable params: 0 (0.00 Byte)
_________________________________________________________________
from tensorflow.keras.optimizers import SGD, Adam

# sgd = SGD(lr=0.0009)  # lr is the learning rate; momentum is an optional parameter
model.compile(optimizer="Adam", loss='mean_squared_error')

history = model.fit(X_train, y_train, batch_size=10, epochs=100, verbose=1, validation_split=0.2)
Epoch 1/100
15/15 [==============================] - 1s 16ms/step - loss: 1.1496 - val_loss: 0.7470 Epoch 2/100
15/15 [==============================] - 0s 4ms/step - loss: 0.9650 - val_loss: 0.6362 Epoch 3/100
15/15 [==============================] - 0s 5ms/step - loss: 0.8374 - val_loss: 0.5473 Epoch 4/100
15/15 [==============================] - 0s 4ms/step - loss: 0.7315 - val_loss: 0.4755 Epoch 5/100
15/15 [==============================] - 0s 4ms/step - loss: 0.6460 - val_loss: 0.4188 Epoch 6/100
15/15 [==============================] - 0s 4ms/step - loss: 0.5725 - val_loss: 0.3671 Epoch 7/100
15/15 [==============================] - 0s 4ms/step - loss: 0.5107 - val_loss: 0.3207 Epoch 8/100
15/15 [==============================] - 0s 5ms/step - loss: 0.4537 - val_loss: 0.2851 Epoch 9/100
15/15 [==============================] - 0s 4ms/step - loss: 0.4087 - val_loss: 0.2527 Epoch 10/100
15/15 [==============================] - 0s 4ms/step - loss: 0.3718 - val_loss: 0.2267 Epoch 11/100
15/15 [==============================] - 0s 4ms/step - loss: 0.3418 - val_loss: 0.2093 Epoch 12/100
15/15 [==============================] - 0s 4ms/step - loss: 0.3234 - val_loss: 0.1965 Epoch 13/100
15/15 [==============================] - 0s 4ms/step - loss: 0.3084 - val_loss: 0.1883 Epoch 14/100
15/15 [==============================] - 0s 4ms/step - loss: 0.3006 - val_loss: 0.1899 Epoch 15/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2931 - val_loss: 0.1821 Epoch 16/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2884 - val_loss: 0.1778 Epoch 17/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2847 - val_loss: 0.1683 Epoch 18/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2811 - val_loss: 0.1657 Epoch 19/100
15/15 [==============================] - 0s 6ms/step - loss: 0.2795 - val_loss: 0.1626 Epoch 20/100
15/15 [==============================] - 0s 5ms/step - loss: 0.2767 - val_loss: 0.1582 Epoch 21/100
15/15 [==============================] - 0s 6ms/step - loss: 0.2749 - val_loss: 0.1539 Epoch 22/100
15/15 [==============================] - 0s 6ms/step - loss: 0.2730 - val_loss: 0.1520 Epoch 23/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2720 - val_loss: 0.1543 Epoch 24/100
15/15 [==============================] - 0s 5ms/step - loss: 0.2710 - val_loss: 0.1461 Epoch 25/100
15/15 [==============================] - 0s 5ms/step - loss: 0.2708 - val_loss: 0.1423 Epoch 26/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2700 - val_loss: 0.1541 Epoch 27/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2670 - val_loss: 0.1463 Epoch 28/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2657 - val_loss: 0.1458 Epoch 29/100
15/15 [==============================] - 0s 6ms/step - loss: 0.2648 - val_loss: 0.1400 Epoch 30/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2637 - val_loss: 0.1412 Epoch 31/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2636 - val_loss: 0.1452 Epoch 32/100
15/15 [==============================] - 0s 6ms/step - loss: 0.2619 - val_loss: 0.1392 Epoch 33/100
15/15 [==============================] - 0s 5ms/step - loss: 0.2636 - val_loss: 0.1435 Epoch 34/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2613 - val_loss: 0.1339 Epoch 35/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2626 - val_loss: 0.1410 Epoch 36/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2601 - val_loss: 0.1320 Epoch 37/100
15/15 [==============================] - 0s 5ms/step - loss: 0.2590 - val_loss: 0.1313 Epoch 38/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2582 - val_loss: 0.1387 Epoch 39/100
15/15 [==============================] - 0s 6ms/step - loss: 0.2577 - val_loss: 0.1397 Epoch 40/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2560 - val_loss: 0.1314 Epoch 41/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2567 - val_loss: 0.1333 Epoch 42/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2550 - val_loss: 0.1333 Epoch 43/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2552 - val_loss: 0.1345 Epoch 44/100
15/15 [==============================] - 0s 5ms/step - loss: 0.2543 - val_loss: 0.1258 Epoch 45/100
15/15 [==============================] - 0s 5ms/step - loss: 0.2526 - val_loss: 0.1293 Epoch 46/100
15/15 [==============================] - 0s 5ms/step - loss: 0.2525 - val_loss: 0.1349 Epoch 47/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2521 - val_loss: 0.1318 Epoch 48/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2520 - val_loss: 0.1368 Epoch 49/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2511 - val_loss: 0.1346 Epoch 50/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2500 - val_loss: 0.1282 Epoch 51/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2492 - val_loss: 0.1277 Epoch 52/100
15/15 [==============================] - 0s 5ms/step - loss: 0.2493 - val_loss: 0.1368 Epoch 53/100
15/15 [==============================] - 0s 5ms/step - loss: 0.2483 - val_loss: 0.1327 Epoch 54/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2474 - val_loss: 0.1291 Epoch 55/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2470 - val_loss: 0.1298 Epoch 56/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2470 - val_loss: 0.1339 Epoch 57/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2456 - val_loss: 0.1314 Epoch 58/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2455 - val_loss: 0.1347 Epoch 59/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2447 - val_loss: 0.1321 Epoch 60/100
15/15 [==============================] - 0s 5ms/step - loss: 0.2438 - val_loss: 0.1324 Epoch 61/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2436 - val_loss: 0.1324 Epoch 62/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2434 - val_loss: 0.1352 Epoch 63/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2419 - val_loss: 0.1297 Epoch 64/100
15/15 [==============================] - 0s 5ms/step - loss: 0.2416 - val_loss: 0.1283 Epoch 65/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2414 - val_loss: 0.1300 Epoch 66/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2412 - val_loss: 0.1342 Epoch 67/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2415 - val_loss: 0.1276 Epoch 68/100
15/15 [==============================] - 0s 5ms/step - loss: 0.2402 - val_loss: 0.1312 Epoch 69/100
15/15 [==============================] - 0s 5ms/step - loss: 0.2396 - val_loss: 0.1342 Epoch 70/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2387 - val_loss: 0.1349 Epoch 71/100
15/15 [==============================] - 0s 5ms/step - loss: 0.2400 - val_loss: 0.1294 Epoch 72/100
15/15 [==============================] - 0s 3ms/step - loss: 0.2418 - val_loss: 0.1411 Epoch 73/100
15/15 [==============================] - 0s 5ms/step - loss: 0.2374 - val_loss: 0.1325 Epoch 74/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2374 - val_loss: 0.1314 Epoch 75/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2357 - val_loss: 0.1381 Epoch 76/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2358 - val_loss: 0.1375 Epoch 77/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2355 - val_loss: 0.1330 Epoch 78/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2352 - val_loss: 0.1369 Epoch 79/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2341 - val_loss: 0.1349 Epoch 80/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2347 - val_loss: 0.1325 Epoch 81/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2336 - val_loss: 0.1367 Epoch 82/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2337 - val_loss: 0.1324 Epoch 83/100
15/15 [==============================] - 0s 5ms/step - loss: 0.2337 - val_loss: 0.1295 Epoch 84/100
15/15 [==============================] - 0s 5ms/step - loss: 0.2325 - val_loss: 0.1400 Epoch 85/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2346 - val_loss: 0.1478 Epoch 86/100
15/15 [==============================] - 0s 5ms/step - loss: 0.2308 - val_loss: 0.1353 Epoch 87/100
15/15 [==============================] - 0s 5ms/step - loss: 0.2311 - val_loss: 0.1366 Epoch 88/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2307 - val_loss: 0.1337 Epoch 89/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2328 - val_loss: 0.1305 Epoch 90/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2298 - val_loss: 0.1362 Epoch 91/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2292 - val_loss: 0.1423 Epoch 92/100
15/15 [==============================] - 0s 5ms/step - loss: 0.2285 - val_loss: 0.1368 Epoch 93/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2276 - val_loss: 0.1361 Epoch 94/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2284 - val_loss: 0.1345 Epoch 95/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2269 - val_loss: 0.1353 Epoch 96/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2267 - val_loss: 0.1372 Epoch 97/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2255 - val_loss: 0.1359 Epoch 98/100
15/15 [==============================] - 0s 5ms/step - loss: 0.2251 - val_loss: 0.1364 Epoch 99/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2248 - val_loss: 0.1359 Epoch 100/100
15/15 [==============================] - 0s 4ms/step - loss: 0.2232 - val_loss: 0.1385

y_pred = model.predict(X_test)
1/1 [==============================] - 0s 93ms/step
Accuracy and evaluation
from sklearn.metrics import r2_score
r2_score(y_test, y_pred)

0.7284987359245992

plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])

[<matplotlib.lines.Line2D at 0x7fa37dede7a0>]
LAB 4
Data Loading
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
import tensorflow as tf
import warnings
warnings.filterwarnings('ignore')

df = pd.read_csv(r"D:\\projects\\DL labs\\laptop_data (Lab4).csv")
df.head()
brand processor_brand processor_name processor_gnrtn ram_gb ram_type ssd hdd os os_bit graphic_card_gb weight warranty Touchscreen msoffice Price rating Number of Ratings Number of Reviews
0 ASUS Intel Core i3 10th 4 GB DDR4 0 GB 1024 GB Windows 64-bit 0 GB Casual No warranty No No 34649 2 stars 3 0
1 Lenovo Intel Core i3 10th 4 GB DDR4 0 GB 1024 GB Windows 64-bit 0 GB Casual No warranty No No 38999 3 stars 65 5
2 Lenovo Intel Core i3 10th 4 GB DDR4 0 GB 1024 GB Windows 64-bit 0 GB Casual No warranty No No 39999 3 stars 8 1
3 ASUS Intel Core i5 10th 8 GB DDR4 512 GB 0 GB Windows 32-bit 2 GB Casual No warranty No No 69990 3 stars 0 0
4 ASUS Intel Celeron Dual Not Available 4 GB DDR4 0 GB 512 GB Windows 64-bit 0 GB Casual No warranty No No 26990 3 stars 0 0
print(df.shape)

(823, 19)
df.isna().sum()
brand 0
processor_brand 0
processor_name 0
processor_gnrtn 0
ram_gb 0
ram_type 0
ssd 0
hdd 0
os 0
os_bit 0
graphic_card_gb 0
weight 0
warranty 0
Touchscreen 0
msoffice 0
Price 0
rating 0
Number of Ratings 0
Number of Reviews 0
dtype: int64

df.duplicated().sum()

21

(The 21 duplicate rows appear to have been dropped before the df.info() call below, since only 802 of the original 823 entries remain; that step is not shown in this excerpt.)

df.info()

<class 'pandas.core.frame.DataFrame'>
Int64Index: 802 entries, 0 to 822
Data columns (total 19 columns):
 #   Column             Non-Null Count  Dtype
---  ------             --------------  -----
 0   brand              802 non-null    object
 1   processor_brand    802 non-null    object
 2   processor_name     802 non-null    object
 3   processor_gnrtn    802 non-null    object
 4   ram_gb             802 non-null    object
 5   ram_type           802 non-null    object
 6   ssd                802 non-null    object
 7   hdd                802 non-null    object
 8   os                 802 non-null    object
 9   os_bit             802 non-null    object
 10  graphic_card_gb    802 non-null    object
 11  weight             802 non-null    object
 12  warranty           802 non-null    object
 13  Touchscreen        802 non-null    object
 14  msoffice           802 non-null    object
 15  Price              802 non-null    int64
 16  rating             802 non-null    object
 17  Number of Ratings  802 non-null    int64
 18  Number of Reviews  802 non-null    int64
dtypes: int64(3), object(16)
memory usage: 125.3+ KB

df.nunique()

brand                  8
processor_brand 3
processor_name 11
processor_gnrtn 8
ram_gb 4
ram_type 6
ssd 7
hdd 4
os 3
os_bit 2
graphic_card_gb 5
weight 3
warranty 4
Touchscreen 2
msoffice 2
Price 405
rating 5
Number of Ratings 282
Number of Reviews    135
dtype: int64

df.describe()
               Price  Number of Ratings  Number of Reviews
count     802.000000          802.00000         802.000000
mean 76625.543641 299.84414 36.089776
std 45232.984422 1001.78442 118.313553
min 16990.000000 0.00000 0.000000
25% 45990.000000 0.00000 0.000000
50% 63990.000000 17.00000 2.000000
75% 89525.000000 140.25000 18.000000
max 441990.000000 15279.00000 1947.000000
EDA
3.1 Histograms
fig, axs = plt.subplots(1, 2, figsize=(15, 7))
plt.subplot(121)
sns.histplot(data=df, x='Price', bins=30, kde=True, color='g')
plt.subplot(122)
sns.histplot(data=df, x='Price', bins=30, kde=True, color='g', hue='rating')
plt.show()
fig, axs = plt.subplots(1, 2, figsize=(15, 7))
plt.subplot(121)
sns.histplot(data=df, x='Number of Ratings', bins=10, kde=True, color='g')
plt.subplot(122)
sns.histplot(data=df, x='Number of Reviews', kde=True, bins=10)
plt.show()
f, ax = plt.subplots(1, 2, figsize=(25, 10))
Group_data = df.groupby('brand')

sns.barplot(x=Group_data['Price'].mean().index, y=Group_data['Price'].mean().values, ax=ax[0], palette='mako')
for container in ax[0].containers:
    ax[0].bar_label(container, color='black', size=20)
ax[0].set_xlabel("Brand")
ax[0].set_ylabel("Price")

palette_color = sns.color_palette('summer')
plt.pie(x=df['brand'].value_counts(), labels=df['brand'].value_counts().index, autopct='%.0f%%', shadow=True, colors=palette_color)
plt.show()
from sklearn.base import BaseEstimator, TransformerMixin
from sklearn.model_selection import train_test_split

class LogScaling(BaseEstimator, TransformerMixin):
    def fit(self, X, y=None):
        return self

    def transform(self, X):
        return np.log1p(X)

X = df.drop('Price', axis=1)
y = df.Price

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.1, random_state=1)
X_train.shape, X_test.shape
((474, 18), (53, 18))
preprocessor = TransformationPipeline().preprocess()
X_train = preprocessor.fit_transform(X_train)
X_test = preprocessor.transform(X_test)
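The TransformationPipeline class used above is not defined anywhere in this excerpt. A minimal sketch of what its preprocess() method might return, assuming one-hot encoding for the categorical columns and the LogScaling transformer for the heavy-tailed count columns (column lists taken from the df.info() output earlier; this is a reconstruction, not the author's code):

from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import OneHotEncoder

class TransformationPipeline:
    """Hypothetical reconstruction of the missing preprocessing pipeline."""
    def preprocess(self):
        count_cols = ['Number of Ratings', 'Number of Reviews']
        categorical_cols = ['brand', 'processor_brand', 'processor_name', 'processor_gnrtn',
                            'ram_gb', 'ram_type', 'ssd', 'hdd', 'os', 'os_bit',
                            'graphic_card_gb', 'weight', 'warranty', 'Touchscreen',
                            'msoffice', 'rating']
        return ColumnTransformer([
            ('log', LogScaling(), count_cols),                         # compress skewed counts
            ('ohe', OneHotEncoder(handle_unknown='ignore'), categorical_cols),
        ])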
Model Building
from sklearn.linear_model import Ridge, Lasso
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor, AdaBoostRegressor
from sklearn.svm import SVR
from xgboost import XGBRegressor
from catboost import CatBoostRegressor
from lightgbm import LGBMRegressor

models = {
    'ridge': Ridge(), 'xgboost': XGBRegressor(), 'catboost': CatBoostRegressor(verbose=0),
    'lightgbm': LGBMRegressor(), 'gradient boosting': GradientBoostingRegressor(), 'lasso': Lasso(),
    'random forest': RandomForestRegressor(), 'support vector': SVR(), 'ada boost regressor': AdaBoostRegressor()
}

for name, model in models.items():
    model.fit(X_train, y_train)
    print(f'{name} trained')

ridge trained
xgboost trained
catboost trained
lightgbm trained
gradient boosting trained
lasso trained
random forest trained
support vector trained
ada boost regressor trained
model = tf.keras.Sequential()
model.add(tf.keras.layers.Dense(79))
model.add(tf.keras.layers.Dense(200))
model.add(tf.keras.layers.Dense(200))
model.add(tf.keras.layers.Dense(200))
model.add(tf.keras.layers.Dense(1))
model.compile(
    loss='mse',
    optimizer=tf.keras.optimizers.Adam(),
    metrics=[tf.keras.metrics.RootMeanSquaredError(name='rmse')]
)

model.fit(X_train.toarray(), y_train, validation_data=(X_test.toarray(), y_test), epochs=50)

Epoch 1/50
15/15 [==============================] - 2s 20ms/step - loss: 6034421248.0000 - rmse: 77681.5391 - val_loss: 6234427392.0000 - val_rmse: 78958.3906 Epoch 2/50
15/15 [==============================] - 0s 5ms/step - loss: 5774142976.0000 - rmse: 75987.7812 - val_loss: 5475202560.0000 - val_rmse: 73994.6094 Epoch 3/50
15/15 [==============================] - 0s 5ms/step - loss: 3929632768.0000 - rmse: 62686.7812 - val_loss: 1857179008.0000 - val_rmse: 43095.0000 Epoch 4/50
15/15 [==============================] - 0s 5ms/step - loss: 1160814080.0000 - rmse: 34070.7227 - val_loss: 788391616.0000 - val_rmse: 28078.3125 Epoch 5/50
15/15 [==============================] - 0s 5ms/step - loss: 748550784.0000 - rmse: 27359.6562 - val_loss: 618806656.0000 - val_rmse: 24875.8242 Epoch 6/50
15/15 [==============================] - 0s 5ms/step - loss: 597152704.0000 - rmse: 24436.7090 - val_loss: 447594048.0000 - val_rmse: 21156.4180 Epoch 7/50
15/15 [==============================] - 0s 5ms/step - loss: 474842368.0000 - rmse: 21790.8789 - val_loss: 356576736.0000 - val_rmse: 18883.2402 Epoch 8/50
15/15 [==============================] - 0s 6ms/step - loss: 386358528.0000 - rmse: 19656.0059 - val_loss: 287757568.0000 - val_rmse: 16963.4180 Epoch 9/50
15/15 [==============================] - 0s 5ms/step - loss: 326689312.0000 - rmse: 18074.5488 - val_loss: 234908032.0000 - val_rmse: 15326.7100 Epoch 10/50
15/15 [==============================] - 0s 5ms/step - loss: 280824512.0000 - rmse: 16757.8203 - val_loss: 196232848.0000 - val_rmse: 14008.3135 Epoch 11/50
15/15 [==============================] - 0s 5ms/step - loss: 250291936.0000 - rmse: 15820.6172 - val_loss: 182136064.0000 - val_rmse: 13495.7793 Epoch 12/50
15/15 [==============================] - 0s 5ms/step - loss: 227414048.0000 - rmse: 15080.2539 - val_loss: 171958544.0000 - val_rmse: 13113.2969 Epoch 13/50
15/15 [==============================] - 0s 5ms/step - loss: 208420736.0000 - rmse: 14436.7842 - val_loss: 150399504.0000 - val_rmse: 12263.7471 Epoch 14/50
15/15 [==============================] - 0s 5ms/step - loss: 196462464.0000 - rmse: 14016.5068 - val_loss: 162332496.0000 - val_rmse: 12740.9766 Epoch 15/50
15/15 [==============================] - 0s 5ms/step - loss: 187526544.0000 - rmse: 13694.0332 - val_loss: 136902528.0000 - val_rmse: 11700.5352 Epoch 16/50
15/15 [==============================] - 0s 5ms/step - loss: 181739216.0000 - rmse: 13481.0684 - val_loss: 138503104.0000 - val_rmse: 11768.7344 Epoch 17/50
15/15 [==============================] - 0s 5ms/step - loss: 173260416.0000 - rmse: 13162.8418 - val_loss: 132641848.0000 - val_rmse: 11517.0244 Epoch 18/50
15/15 [==============================] - 0s 5ms/step - loss: 169793728.0000 - rmse: 13030.4922 - val_loss: 132304008.0000 - val_rmse: 11502.3477 Epoch 19/50
15/15 [==============================] - 0s 5ms/step - loss: 165738240.0000 - rmse: 12873.9365 - val_loss: 141604048.0000 - val_rmse: 11899.7500 Epoch 20/50
15/15 [==============================] - 0s 5ms/step - loss: 160626368.0000 - rmse: 12673.8457 - val_loss: 126371672.0000 - val_rmse: 11241.5156 Epoch 21/50
15/15 [==============================] - 0s 5ms/step - loss: 158771104.0000 - rmse: 12600.4404 - val_loss: 137791888.0000 - val_rmse: 11738.4785 Epoch 22/50
15/15 [==============================] - 0s 5ms/step - loss: 156332720.0000 - rmse: 12503.3086 - val_loss: 128175608.0000 - val_rmse: 11321.4668 Epoch 23/50
15/15 [==============================] - 0s 5ms/step - loss: 152731296.0000 - rmse: 12358.4502 - val_loss: 137310656.0000 - val_rmse: 11717.9629 Epoch 24/50
15/15 [==============================] - 0s 5ms/step - loss: 153770576.0000 - rmse: 12400.4268 - val_loss: 121157432.0000 - val_rmse: 11007.1533 Epoch 25/50
15/15 [==============================] - 0s 5ms/step - loss: 152454816.0000 - rmse: 12347.2598 - val_loss: 125706024.0000 - val_rmse: 11211.8701 Epoch 26/50
15/15 [==============================] - 0s 5ms/step - loss: 148866704.0000 - rmse: 12201.0947 - val_loss: 135644288.0000 - val_rmse: 11646.6426 Epoch 27/50
15/15 [==============================] - 0s 5ms/step - loss: 150559616.0000 - rmse: 12270.2734 - val_loss: 131070976.0000 - val_rmse: 11448.6230 Epoch 28/50
15/15 [==============================] - 0s 5ms/step - loss: 149590688.0000 - rmse: 12230.7275 - val_loss: 129704600.0000 - val_rmse: 11388.7930 Epoch 29/50
15/15 [==============================] - 0s 7ms/step - loss: 145915072.0000 - rmse: 12079.5312 - val_loss: 122332960.0000 - val_rmse: 11060.4238 Epoch 30/50
15/15 [==============================] - 0s 9ms/step - loss: 146382480.0000 - rmse: 12098.8623 - val_loss: 119537144.0000 - val_rmse: 10933.3047 Epoch 31/50
15/15 [==============================] - 0s 5ms/step - loss: 143537600.0000 - rmse: 11980.7178 - val_loss: 151919040.0000 - val_rmse: 12325.5439 Epoch 32/50
15/15 [==============================] - 0s 5ms/step - loss: 152521200.0000 - rmse: 12349.9473 - val_loss: 134938816.0000 - val_rmse: 11616.3164 Epoch 33/50
15/15 [==============================] - 0s 5ms/step - loss: 146218640.0000 - rmse: 12092.0898 - val_loss: 121597448.0000 - val_rmse: 11027.1230 Epoch 34/50
15/15 [==============================] - 0s 5ms/step - loss: 142932032.0000 - rmse: 11955.4189 - val_loss: 117592840.0000 - val_rmse: 10844.0234 Epoch 35/50
15/15 [==============================] - 0s 5ms/step - loss: 145713008.0000 - rmse: 12071.1641 - val_loss: 131047104.0000 - val_rmse: 11447.5811 Epoch 36/50
15/15 [==============================] - 0s 5ms/step - loss: 143895232.0000 - rmse: 11995.6338 - val_loss: 123311584.0000 - val_rmse: 11104.5752 Epoch 37/50
15/15 [==============================] - 0s 5ms/step - loss: 141400208.0000 - rmse: 11891.1816 - val_loss: 124178488.0000 - val_rmse: 11143.5400 Epoch 38/50
15/15 [==============================] - 0s 5ms/step - loss: 142977744.0000 - rmse: 11957.3301 - val_loss: 137996064.0000 - val_rmse: 11747.1729 Epoch 39/50
15/15 [==============================] - 0s 5ms/step - loss: 144870944.0000 - rmse: 12036.2344 - val_loss: 118785816.0000 - val_rmse: 10898.8906 Epoch 40/50
15/15 [==============================] - 0s 5ms/step - loss: 142753216.0000 - rmse: 11947.9375 - val_loss: 112753264.0000 - val_rmse: 10618.5332 Epoch 41/50
15/15 [==============================] - 0s 5ms/step - loss: 142975392.0000 - rmse: 11957.2314 - val_loss: 122126944.0000 - val_rmse: 11051.1064 Epoch 42/50
15/15 [==============================] - 0s 5ms/step - loss: 139251760.0000 - rmse: 11800.4980 - val_loss: 133205128.0000 - val_rmse: 11541.4521 Epoch 43/50
15/15 [==============================] - 0s 5ms/step - loss: 142358032.0000 - rmse: 11931.3887 - val_loss: 113410872.0000 - val_rmse: 10649.4541 Epoch 44/50
15/15 [==============================] - 0s 5ms/step - loss: 140888160.0000 - rmse: 11869.6318 - val_loss: 118865976.0000 - val_rmse: 10902.5674 Epoch 45/50
15/15 [==============================] - 0s 5ms/step - loss: 141266640.0000 - rmse: 11885.5645 - val_loss: 116452768.0000 - val_rmse: 10791.3281 Epoch 46/50
15/15 [==============================] - 0s 5ms/step - loss: 139959888.0000 - rmse: 11830.4648 - val_loss: 126211768.0000 - val_rmse: 11234.4004 Epoch 47/50
15/15 [==============================] - 0s 5ms/step - loss: 139866128.0000 - rmse: 11826.5010 - val_loss: 121329816.0000 - val_rmse: 11014.9814 Epoch 48/50
15/15 [==============================] - 0s 5ms/step - loss: 141024816.0000 - rmse: 11875.3867 - val_loss: 160072224.0000 - val_rmse: 12651.9658 Epoch 49/50
15/15 [==============================] - 0s 5ms/step - loss: 141752000.0000 - rmse: 11905.9648 - val_loss: 125606624.0000 - val_rmse: 11207.4365 Epoch 50/50
15/15 [==============================] - 0s 5ms/step - loss: 139299664.0000 - rmse: 11802.5283 - val_loss: 126747960.0000 - val_rmse: 11258.2393
<keras.callbacks.History at 0x7cf614465540>
RMSE_ann = model.evaluate(X_test.toarray(), y_test)[1]
RMSE_ann
2/2 [==============================] - 0s 5ms/step - loss: 126747960.0000 - rmse: 11258.2393

11258.2392578125
Accuracy and evaluation
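final_predictions, used in the evaluation below, is not defined in this excerpt; it presumably holds the test-set predictions of whichever of the models trained above performed best. A hedged sketch (the choice of the gradient boosting model here is an assumption, not the author's stated choice):

from sklearn.metrics import mean_squared_error, r2_score

# Hypothetical: take one of the fitted models above as the final model
final_model = models['gradient boosting']
final_predictions = final_model.predict(X_test)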
print(f'RMSE: {np.sqrt(mean_squared_error(y_test, final_predictions))}')
print(f'R-square: {r2_score(y_test, final_predictions)}')

RMSE: 12058.878884898246
R-square: 0.8240078900192354
# Distribution of error
plt.figure(figsize=(10, 6))
sns.histplot(y_test - final_predictions, color='#005b96', kde=True)
plt.xlabel('Error');
LAB 5
Data Loading
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
%matplotlib inline

dataset = pd.read_csv(r"D:\projects\DL labs\MobilePrice(Lab5).csv")
dataset.head(2)
battery_power blue clock_speed dual_sim fc four_g int_memory m_dep mobile_wt n_cores ... px_height px_width ram sc_h sc_w talk_time three_g touch_screen wifi price_rang
0 842 0 2.2 0 1 0 7 0.6 188 2 ... 20 756 2549 9 7 19 0 0 1
1 1021 1 0.5 1 0 1 53 0.7 136 3 ... 905 1988 2631 17 3 7 1 1 0
2 rows × 21 columns
dataset.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 2000 entries, 0 to 1999
Data columns (total 21 columns):
 #   Column         Non-Null Count  Dtype
---  ------         --------------  -----
 0   battery_power  2000 non-null   int64
 1   blue           2000 non-null   int64
 2   clock_speed    2000 non-null   float64
 3   dual_sim       2000 non-null   int64
 4   fc             2000 non-null   int64
 5   four_g         2000 non-null   int64
 6   int_memory     2000 non-null   int64
 7   m_dep          2000 non-null   float64
 8   mobile_wt      2000 non-null   int64
 9   n_cores        2000 non-null   int64
 10  pc             2000 non-null   int64
 11  px_height      2000 non-null   int64
 12  px_width       2000 non-null   int64
 13  ram            2000 non-null   int64
 14  sc_h           2000 non-null   int64
 15  sc_w           2000 non-null   int64
 16  talk_time      2000 non-null   int64
 17  three_g        2000 non-null   int64
 18  touch_screen   2000 non-null   int64
 19  wifi           2000 non-null   int64
 20  price_range    2000 non-null   int64
dtypes: float64(2), int64(19)
memory usage: 328.2 KB

dataset.describe()
       battery_power       blue  clock_speed     dual_sim           fc       four_g   int_memory        m_dep    mobile_wt      n_cores  ...    px_height     px_width          ram         sc_h
count    2000.000000  2000.0000  2000.000000  2000.000000  2000.000000  2000.000000  2000.000000  2000.000000  2000.000000  2000.000000  ...  2000.000000  2000.000000  2000.000000  2000.000000
mean     1238.518500     0.4950     1.522250     0.509500     4.309500     0.521500    32.046500     0.501750   140.249000     4.520500  ...   645.108000  1251.515500  2124.213000    12.306500
std       439.418206     0.5001     0.816004     0.500035     4.341444     0.499662    18.145715     0.288416    35.399655     2.287837  ...   443.780811   432.199447  1084.732044     4.213245
min       501.000000     0.0000     0.500000     0.000000     0.000000     0.000000     2.000000     0.100000    80.000000     1.000000  ...     0.000000   500.000000   256.000000     5.000000
25%       851.750000     0.0000     0.700000     0.000000     1.000000     0.000000    16.000000     0.200000   109.000000     3.000000  ...   282.750000   874.750000  1207.500000     9.000000
50%      1226.000000     0.0000     1.500000     1.000000     3.000000     1.000000    32.000000     0.500000   141.000000     4.000000  ...   564.000000  1247.000000  2146.500000    12.000000
75%      1615.250000     1.0000     2.200000     1.000000     7.000000     1.000000    48.000000     0.800000   170.000000     7.000000  ...   947.250000  1633.000000  3064.500000    16.000000
max      1998.000000     1.0000     3.000000     1