MusaChowdhury/Approximating-Continuous-Function-With-ReLU

Overview

This repository contains a Jupyter Notebook demonstrating how continuous functions can be approximated with Rectified Linear Units (ReLU).
The notebook explores how sums of shifted and scaled ReLU activations, the same nonlinearity used throughout neural networks, produce piecewise-linear approximations of continuous mathematical functions.
It includes detailed visualizations and code examples to illustrate the concepts.
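
The notebook's own code is not reproduced here, but the core construction can be sketched in a few lines. The sketch below is an illustrative assumption rather than the notebook's implementation: the target sin(x), the interval [0, 2π], and the knot count of 12 are arbitrary choices. It builds a piecewise-linear interpolant as a weighted sum of shifted ReLUs, where each additional ReLU bends the line at one knot.

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative target function and interval (assumptions, not the notebook's exact setup).
f = np.sin
knots = np.linspace(0.0, 2.0 * np.pi, 12)  # points where the approximation may change slope
values = f(knots)

# Slope of each linear piece between consecutive knots.
slopes = np.diff(values) / np.diff(knots)

def relu(x):
    return np.maximum(0.0, x)

def relu_approx(x):
    """Piecewise-linear interpolant of f expressed as a sum of shifted ReLUs."""
    # First piece: start at f(knots[0]) and rise with the first slope.
    y = values[0] + slopes[0] * relu(x - knots[0])
    for j in range(1, len(slopes)):
        # Each extra ReLU bends the line by the change in slope at knot j.
        y += (slopes[j] - slopes[j - 1]) * relu(x - knots[j])
    return y

xs = np.linspace(0.0, 2.0 * np.pi, 500)
plt.plot(xs, f(xs), label="sin(x)")
plt.plot(xs, relu_approx(xs), "--", label="ReLU approximation")
plt.legend()
plt.show()
```

With more knots, the dashed ReLU sum tracks the target curve ever more closely, which is the qualitative behavior the figures below illustrate.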

Example

The following images show function approximations using multiple stacked ReLUs at different intervals.

(Figures a–d: function approximations using stacked ReLUs at different intervals.)
