Commit 853407e: updates
Signed-off-by: Raj Patil <rajp152k@gmail.com>
rajp152k committed Jan 27, 2025
1 parent e3307c9 commit 853407e
Showing 13 changed files with 256 additions and 5 deletions.
5 changes: 2 additions & 3 deletions Content/20240205171209-go.org

* Org Babel Base
#+begin_src go :exports both
package main

import (
	"fmt"
)

func main(){
	fmt.Println("org babel base")
}
#+end_src

#+RESULTS:
: org babel base

* Stream
** 0x22B2
- post theGoPL, will finish off TDD with Go
15 changes: 15 additions & 0 deletions Content/20241115141952-mathematical_induction.org
:END:
#+title: mathematical induction
#+filetags: :math:

* Overview

- *Definition*: Mathematical induction is a proof technique used to establish the truth of an infinite number of statements, typically involving integers.

- *Two Main Components*:
- *Base Case*: Show that the statement holds for the initial value (usually n=1).
- *Inductive Step*: Assume the statement is true for some arbitrary integer k (inductive hypothesis) and then prove it must also be true for k+1.

- *Structure of Induction*:
1. *Base Case* (n=1): Verify the assertion is correct for the first integer.
2. *Inductive Hypothesis*: Assume the assertion holds for n=k.
3. *Inductive Step*: Prove the assertion holds for n=k+1 using the inductive hypothesis.

- *[[id:95edc4bc-c364-4b18-833a-ba476b3283e8][Recursive]] Structures*: Induction is closely related to recursion in computer science, where a problem is solved by reducing it to smaller instances of the same problem.
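The correspondence with recursion can be made concrete in a short sketch (Go, to match the other snippets in these notes; the function is illustrative, not from any library). A recursive sum over the first n integers mirrors the two components of an induction proof of 1 + 2 + … + n = n(n+1)/2:

#+begin_src go :exports both
package main

import "fmt"

// sum returns 1 + 2 + ... + n recursively.
// Base case: sum(0) = 0, matching the proof's base case.
// Inductive step: sum(n) = sum(n-1) + n, assuming the result
// already holds for n-1 (the inductive hypothesis).
func sum(n int) int {
	if n == 0 {
		return 0
	}
	return sum(n-1) + n
}

func main() {
	for n := 0; n <= 5; n++ {
		// the closed form n(n+1)/2 agrees with the recursive definition
		fmt.Println(n, sum(n), n*(n+1)/2)
	}
}
#+end_src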
4 changes: 4 additions & 0 deletions Content/20250122095143-dsindex.org
Index into data structures of varying complexities and brief descriptions about the same
| Data Structure | Desc |
|----------------+----------------------------|
| [[id:3821a4f5-998a-4903-970f-d95bf2ed8cd4][Binary Tree]] | forkable linked lists |
| [[id:1d703f5b-8b5e-4c82-9393-a2c88294c959][Graphs]] | Edges and Nodes |
| [[id:20240519T201001.324666][Merkle Tree]] | scalable data verification |
| [[id:4054ebe7-812f-4204-be63-bb7379d5b56b][Queue]] | FIFO/LILO |
| [[id:e20be945-5df7-4b35-aab5-a9a439b62de8][Stack]] | FILO/LIFO |
| [[id:9a7e1b83-9160-40a7-821b-0f0ada44e350][Linked lists]] | Fundamental data and glue |
1 change: 0 additions & 1 deletion Content/20250124104046-design_patterns.org

Generic Patterns, or their compositions, that can be thrown at problems before you have to start inventing novel ones.


* Resources
- https://refactoring.guru/design-patterns
16 changes: 15 additions & 1 deletion Content/20250124131606-master_theorem.org
#+title: Master Theorem
#+filetags: :cs:algo:

See [[id:8e9f6cef-da57-48ed-b86d-029f1b528615][Time Complexity]]

* Overview

- *Definition*: The Master Theorem provides a method for analyzing the [[id:8e9f6cef-da57-48ed-b86d-029f1b528615][time complexity]] of [[id:95edc4bc-c364-4b18-833a-ba476b3283e8][recursive]] algorithms.
- *Form*: The theorem applies to recurrences of the form T(n) = aT(n/b) + f(n), where:
- T(n) is the time complexity,
- a ≥ 1 is the number of subproblems,
- b > 1 is the factor by which the problem size is reduced,
- f(n) is a function that describes the cost of dividing the problem and combining the results.
- *Applications*: Often used in the analysis of [[id:60121a6c-9dd8-4a17-8a87-15e8147ab228][divide-and-conquer]] algorithms, such as mergesort and quicksort.

*Cases of the Master Theorem*:
1. *Case 1*: If f(n) is polynomially smaller than n^(log_b(a)), then T(n) = Θ(n^(log_b(a))).
2. *Case 2*: If f(n) is asymptotically the same as n^(log_b(a)), then T(n) = Θ(n^(log_b(a)) log(n)).
3. *Case 3*: If f(n) is polynomially larger than n^(log_b(a)), and the regularity condition holds, then T(n) = Θ(f(n)).
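The three cases can be checked mechanically for the common special case f(n) = Θ(n^d) (a sketch; the =classify= helper is illustrative, not a library function — it just compares d against log_b(a)):

#+begin_src go :exports both
package main

import (
	"fmt"
	"math"
)

// classify applies the Master Theorem to T(n) = aT(n/b) + Θ(n^d)
// and returns which of the three cases applies.
func classify(a, b, d float64) int {
	c := math.Log(a) / math.Log(b) // log_b(a)
	const eps = 1e-9
	switch {
	case d < c-eps:
		return 1 // T(n) = Θ(n^log_b(a))
	case d > c+eps:
		return 3 // T(n) = Θ(n^d); regularity holds for polynomial f
	default:
		return 2 // T(n) = Θ(n^log_b(a) · log n)
	}
}

func main() {
	fmt.Println(classify(2, 2, 1)) // mergesort: 2T(n/2) + Θ(n) → case 2
	fmt.Println(classify(1, 2, 0)) // binary search: T(n/2) + Θ(1) → case 2
	fmt.Println(classify(4, 2, 1)) // 4T(n/2) + Θ(n) → case 1
	fmt.Println(classify(2, 2, 2)) // 2T(n/2) + Θ(n^2) → case 3
}
#+end_src

For mergesort, a = b = 2 gives log_2(2) = 1 = d, so case 2 yields the familiar Θ(n log n).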

* Resources
- https://en.wikipedia.org/wiki/Master_theorem_(analysis_of_algorithms)
62 changes: 62 additions & 0 deletions Content/20250125151342-insertion_sort.org
#+title: Insertion Sort
#+filetags: :algo:cs:


#+begin_src go :exports both
package main

import (
"fmt"
)

func insertionSort(arr []int) []int {
	for i := 1; i < len(arr); i++ {
		key := arr[i]
		j := i - 1
		for j >= 0 && arr[j] > key {
			arr[j+1] = arr[j]
			j--
		}
		arr[j+1] = key
	}
	return arr
}

func main(){
	arr := []int{4, 2, 5, 1, -3, -2}
	sortedArr := insertionSort(arr)
	fmt.Println(sortedArr)
}
#+end_src

#+RESULTS:
: [-3 -2 1 2 4 5]

* Overview
- *Complexity*:
- *Time Complexity*:
- Best Case: \( O(n) \) (when the array is already sorted)
- Average Case: \( O(n^2) \)
- Worst Case: \( O(n^2) \) (when the array is sorted in reverse order)
- *Space Complexity*: \( O(1) \) (in-place sorting)

- *Stability*: Insertion Sort is a [[id:00d20a5b-be5e-44a4-a95f-44690883418d][stable sorting]] algorithm.

- *Use Cases*:
- Suitable for small datasets.
- Efficient for data that is already partially sorted.
- Often used in practice for small data sets or as a part of more complex algorithms.

*** Connections and Observations

- *Real-World Application*:
- Often used in hybrid sorting algorithms (like [[id:7c0d5d3c-50a3-4ed1-ad97-7b3cde2462bc][Timsort]]) where it is combined with other algorithms for better performance on small subarrays.

*** Further Inquiry

To expand understanding of Insertion Sort and its applications, consider the following questions:

1. What are the advantages and disadvantages of using Insertion Sort compared to algorithms like Merge Sort or Quick Sort in various contexts?
2. In what specific scenarios or types of datasets would Insertion Sort prove to be the most efficient?
3. How do variations of Insertion Sort, like Binary Insertion Sort, improve sorting times?
4. How does the stability of Insertion Sort affect its behavior with complex data structures (e.g., sorting objects based on multiple attributes)?

Exploring these questions could deepen your understanding of sorting algorithms and their appropriate applications.
8 changes: 8 additions & 0 deletions Content/20250125160140-stable_sorting.org
:PROPERTIES:
:ID: 00d20a5b-be5e-44a4-a95f-44690883418d
:END:
#+title: Stable Sorting
#+filetags: :algo:cs:

* Resources
- https://www.baeldung.com/cs/stable-sorting-algorithms
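A quick illustration of what stability buys, using Go's standard library: =sort.SliceStable= guarantees that elements comparing equal keep their input order, while plain =sort.Slice= makes no such promise. (The record type and data are made up for the example.)

#+begin_src go :exports both
package main

import (
	"fmt"
	"sort"
)

type record struct {
	name  string
	score int
}

func main() {
	// alice and carol share score 70; a stable sort must keep
	// alice before carol because that is their input order.
	recs := []record{
		{"alice", 70},
		{"bob", 90},
		{"carol", 70},
	}
	sort.SliceStable(recs, func(i, j int) bool {
		return recs[i].score < recs[j].score
	})
	fmt.Println(recs) // [{alice 70} {carol 70} {bob 90}]
}
#+end_src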
46 changes: 46 additions & 0 deletions Content/20250125160312-timsort.org
:PROPERTIES:
:ID: 7c0d5d3c-50a3-4ed1-ad97-7b3cde2462bc
:END:
#+title: Timsort
#+filetags: :algo:cs:

* Overview

- *Definition*: Timsort is a hybrid sorting algorithm derived from [[id:fa43a1e8-2bee-47d3-98c6-6037a9b0f8ee][merge sort]] and [[id:c70dbfb7-1556-47d6-a590-c438e9662d91][insertion sort]].
- *Origin*: Developed by Tim Peters in 2002 for use in the [[id:985a470b-7184-4f9f-8b16-fe7b90bccebe][Python]] programming language.
- *Stability*: Timsort is a [[id:00d20a5b-be5e-44a4-a95f-44690883418d][stable sort]], meaning it maintains the relative order of items that compare equal.
- *Complexity*:
- Best Case Time Complexity: O(n) when the input list is already sorted.
- Average Case Time Complexity: O(n log n).
- Worst Case Time Complexity: O(n log n).
- *Space Complexity*: O(n) due to temporary arrays created during the merge process.
- *Use Cases*: Suitable for real-world data which often contains ordered sequences (runs), such as sorting large datasets in Python or [[id:b056e747-dee4-4e6d-a7af-d644f842f0b8][Java]].
- *Adaptivity*: Timsort takes advantage of existing order in data by identifying runs (consecutive ordered sequences).

*** Connections
- *Hybrid Nature*: Combines strengths of insertion sort (efficient for small datasets or partially sorted data) and merge sort (efficient for larger datasets).
- *Stability Significance*: Important in applications where the original order of equal elements is needed (e.g., in database records).
- *Practical Performance*: Despite theoretical complexities, Timsort performs remarkably well on various datasets due to its adaptivity.

* Mechanism of Timsort

- Step 1: Identify Runs:
- Timsort begins by partitioning the input array into small segments known as runs, which are either ordered ascending or descending sequences.
- It uses a minimum run size, typically between 32 and 64 elements, to ensure efficient processing.

- Step 2: Sort Each Run:
- Each run is sorted using insertion sort, which is well-suited for small datasets due to its low overhead.

- Step 3: Merge Runs:
- Sorted runs are merged together using a modified merge sort algorithm.
- The merging process takes care to maintain the stability of the sorting and uses a [[id:e20be945-5df7-4b35-aab5-a9a439b62de8][stack]] to keep track of runs.

- Step 4: Manage Stack of Runs:
- Runs are pushed onto a stack, and based on certain size constraints, Timsort merges those runs to ensure that the overall sorting remains efficient.
- The merging strategy uses the principles of maintaining balanced merges, similar to a binary tree structure.

- Step 5: Final Merge:
- The process continues until all runs are merged into a final sorted array.
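The steps above can be sketched in miniature (a deliberately simplified version: a fixed tiny run size instead of computed minruns, naive pairwise bottom-up merging, and none of the run-stack invariants or galloping of real Timsort):

#+begin_src go :exports both
package main

import "fmt"

const minRun = 4 // real Timsort uses 32-64; kept small for illustration

// insertionSort sorts arr[lo:hi] in place (Step 2).
func insertionSort(arr []int, lo, hi int) {
	for i := lo + 1; i < hi; i++ {
		key := arr[i]
		j := i - 1
		for j >= lo && arr[j] > key {
			arr[j+1] = arr[j]
			j--
		}
		arr[j+1] = key
	}
}

// merge combines the adjacent sorted slices arr[lo:mid] and arr[mid:hi],
// preferring the left element on ties to preserve stability (Step 3).
func merge(arr []int, lo, mid, hi int) {
	tmp := make([]int, 0, hi-lo)
	i, j := lo, mid
	for i < mid || j < hi {
		if j == hi || (i < mid && arr[i] <= arr[j]) {
			tmp = append(tmp, arr[i])
			i++
		} else {
			tmp = append(tmp, arr[j])
			j++
		}
	}
	copy(arr[lo:hi], tmp)
}

// timSort: fixed-size runs sorted by insertion sort, then repeated
// pairwise merges (Steps 1, 2 and 5; the run-stack bookkeeping of
// Step 4 is omitted for brevity).
func timSort(arr []int) {
	n := len(arr)
	for lo := 0; lo < n; lo += minRun {
		hi := lo + minRun
		if hi > n {
			hi = n
		}
		insertionSort(arr, lo, hi)
	}
	for size := minRun; size < n; size *= 2 {
		for lo := 0; lo < n; lo += 2 * size {
			mid, hi := lo+size, lo+2*size
			if hi > n {
				hi = n
			}
			if mid < hi {
				merge(arr, lo, mid, hi)
			}
		}
	}
}

func main() {
	arr := []int{5, 2, 9, 1, 7, 3, 8, 4, 6, 0}
	timSort(arr)
	fmt.Println(arr)
}
#+end_src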

* Resources
- https://en.wikipedia.org/wiki/Timsort
84 changes: 84 additions & 0 deletions Content/20250125160410-merge_sort.org
:PROPERTIES:
:ID: fa43a1e8-2bee-47d3-98c6-6037a9b0f8ee
:END:
#+title: Merge Sort
#+filetags: :algo:cs:

#+begin_src go :exports both
package main

import (
"fmt"
)

func merge(a1 []int, a2 []int) []int {
	n, m := len(a1), len(a2)
	res := make([]int, n+m)
	i, j, k := 0, 0, 0
	for i < n || j < m {
		// <= keeps equal elements in their original order (stability)
		if j == m || (i < n && a1[i] <= a2[j]) {
			res[k] = a1[i]
			k++
			i++
		} else {
			res[k] = a2[j]
			k++
			j++
		}
	}
	return res
}

func mergeSort(arr []int) []int {
	if len(arr) < 2 {
		return arr
	}
	mid := len(arr) / 2
	return merge(mergeSort(arr[:mid]), mergeSort(arr[mid:]))
}

func main() {
	arr := []int{4, 2, 5, 1, -3, -2}
	sortedArr := mergeSort(arr)
	fmt.Println(sortedArr)
}
#+end_src

#+RESULTS:
: [-3 -2 1 2 4 5]

* Overview

- *Definition*: Merge sort is a [[id:60121a6c-9dd8-4a17-8a87-15e8147ab228][divide-and-conquer]] algorithm that sorts an array by [[id:95edc4bc-c364-4b18-833a-ba476b3283e8][recursively]] dividing it into two halves, sorting each half, and then merging the sorted halves.

*** Key Features:
- *Time Complexity*: O(n log n) on average, best, and worst cases.
- *Space Complexity*: O(n) due to the use of temporary arrays for merging.
- *Stability*: It is a [[id:00d20a5b-be5e-44a4-a95f-44690883418d][stable sort]]; equal elements maintain their relative order.
- *Adaptivity*: Not adaptive; it does the same work regardless of input order.

*** Steps of the Algorithm:
1. *Recursive Division*:
- Split the input array into two halves.
- Continue splitting until each sub-array has at most one element (base case).

2. *Merging*:
- Combine the sorted halves back together in a sorted manner by comparing the elements of both halves.

3. *Base Case*:
- An array of length 0 or 1 is already sorted.

*** Misc
- Merge sort is particularly effective for:
- Sorting [[id:9a7e1b83-9160-40a7-821b-0f0ada44e350][linked lists]].
- Large datasets that do not fit into memory, as it can be adapted to external sorting.

*** Questions for Clarification:
- Which aspects of merge sort warrant further exploration (e.g., applications, comparisons, optimizations)?
- How do merge sort implementations differ across programming languages?

*** Pathways for Further Research:
- How does merge sort compare with non-comparison-based sorting algorithms (e.g., radix sort, counting sort)?
- What are the practical applications of merge sort in real-world systems and data processing?
- Explore the theoretical limits of sorting algorithms: What are the proven lower bounds for comparison-based sorting?
- Investigate optimizations of merge sort, particularly in adaptive contexts or with parallel processing techniques.
5 changes: 5 additions & 0 deletions Content/20250125160547-java.org
:PROPERTIES:
:ID: b056e747-dee4-4e6d-a7af-d644f842f0b8
:END:
#+title: Java
#+filetags: :java:programming:
5 changes: 5 additions & 0 deletions Content/20250125160850-stack.org
:PROPERTIES:
:ID: e20be945-5df7-4b35-aab5-a9a439b62de8
:END:
#+title: Stack
#+filetags: :data:cs:
5 changes: 5 additions & 0 deletions Content/20250125160947-queue.org
:PROPERTIES:
:ID: 4054ebe7-812f-4204-be63-bb7379d5b56b
:END:
#+title: Queue
#+filetags: :data:cs:
5 changes: 5 additions & 0 deletions Content/20250126171754-linked_lists.org
:PROPERTIES:
:ID: 9a7e1b83-9160-40a7-821b-0f0ada44e350
:END:
#+title: Linked lists
#+filetags: :data:cs:
