💄 Redesign file explorer and isolate to shared component(s) (#286)
* Fix dead docs links

* WIP explorer refactor

* WIP explorer refactor, focus on styles

* 📱 Redesign file explorer and isolate to shared component

Also some bug fixes and optimizations

* Fix clippy lint
aaronleopold authored Feb 27, 2024
1 parent a2a49b7 commit 1d30084
Showing 43 changed files with 867 additions and 325 deletions.
1 change: 1 addition & 0 deletions .github/actions/build-docker/action.yml
@@ -71,6 +71,7 @@ runs:
context: .
build-args: |
"GIT_REV=${{ env.GIT_REV }}"
"TAGS=${{ env.TAGS }}"
file: docker/Dockerfile
platforms: ${{ inputs.platforms }}
load: ${{ inputs.load }}
38 changes: 16 additions & 22 deletions apps/expo/README.md
@@ -1,6 +1,6 @@
# Stump Mobile Application

This is the mobile application for the Stump. It is built with [Expo](https://expo.io/), and is currently in the very early stages of development.
This is the mobile application for the Stump. It is built with [Expo](https://expo.io/) and is currently in the very early stages of development.

## Getting Started 🚀

@@ -10,9 +10,18 @@ This is the mobile application for the Stump. It is built with [Expo](https://ex
2. Follow the [developer guide](https://github.com/aaronleopold/stump#developer-guide-) at the root of the Stump monorepo
3. Start the mobile app and the server:

```bash
moon run server:start mobile:dev # or server:dev if you want to run the server in dev mode
```
To start the server and mobile app concurrently, you can use the following command:

```bash
yarn dev:expo
```

If you want to start the server and mobile app separately, you can use the following commands in two separate terminals:

```bash
cargo run --package stump_server
yarn workspace @stump/expo start --clear
```

And that's it!

@@ -29,24 +38,9 @@ Be sure to review the [CONTRIBUTING.md](https://github.com/aaronleopold/stump/tr

## Roadmap 🗺️

With the `v0.1.0` release of the Stump server (very) slowly approaching, the mobile app has the following items targeted for an _initial POC_:

- [ ] Various initial expo-related project configuration:
- [ ] Appropriate routing setup (e.g. tabs and stack navigators)
- [x] Configure and connect to a Stump server instance
- [x] Login or claim an unclaimed server instance
- [ ] A home screen that shows the server at a glance:
- [x] Various server statistics
- [ ] In progress media
- [ ] Newly added series and books
- [ ] A library screen that shows a paginated list of series within a library
- [ ] A series screen that shows a paginated list of media within a series
- [ ] A very basic book overview screen
- [ ] Support barebones readers for:
- [ ] Epub
- [ ] CBZ/CBR
- [ ] Dark theme support
You can find the high-level roadmap for the Stump mobile app in the [documentation](https://www.stumpapp.dev/guides/mobile/app#planned-features). For a more granular view of what is coming, you can also take a look at the [project board](https://github.com/orgs/stumpapp/projects/8).

## Acknowledgements 🙏

- Thanks to [@dancamdev](https://github.com/dancamdev) for bootstrapping this Expo project 🙌
- Thanks to [@dancamdev](https://github.com/dancamdev) for bootstrapping this Expo project template 🙌
- Thanks to [@LRotenberger](https://github.com/LRotenberger) for building out the initial POC for the mobile app 🚀
2 changes: 1 addition & 1 deletion core/src/db/entity/job.rs
@@ -19,7 +19,7 @@ use super::{Cursor, Log};
pub type ExternalJobOutput = serde_json::Value;

/// An enum which represents the possible outputs of a job in the Stump core
#[derive(Clone, Serialize, Deserialize, Type, ToSchema)]
#[derive(Debug, Clone, Serialize, Deserialize, Type, ToSchema)]
#[serde(untagged)]
pub enum CoreJobOutput {
LibraryScan(LibraryScanOutput),
1 change: 1 addition & 0 deletions core/src/db/entity/media/mod.rs
@@ -1,6 +1,7 @@
mod annotation;
mod bookmark;
mod entity;
pub(crate) mod prisma_macros;
mod progress;
pub(crate) mod utils;

6 changes: 6 additions & 0 deletions core/src/db/entity/media/prisma_macros.rs
@@ -0,0 +1,6 @@
use crate::prisma::media;

media::select!(media_path_modified_at_select {
path
modified_at
});
1 change: 1 addition & 0 deletions core/src/db/entity/mod.rs
@@ -34,6 +34,7 @@ pub use common::{AccessRole, Cursor, EntityVisibility, FileStatus, LayoutMode};

pub mod macros {
pub use super::book_club::prisma_macros::*;
pub use super::media::prisma_macros::*;
pub use super::metadata::prisma_macros::*;
pub use super::smart_list::prisma_macros::*;
}
6 changes: 5 additions & 1 deletion core/src/event.rs
@@ -1,14 +1,18 @@
use serde::{Deserialize, Serialize};
use specta::Type;

use crate::job::{JobUpdate, WorkerSend, WorkerSendExt};
use crate::{
db::entity::CoreJobOutput,
job::{JobUpdate, WorkerSend, WorkerSendExt},
};

/// An event that is emitted by the core and consumed by a client
#[derive(Clone, Serialize, Deserialize, Debug, Type)]
#[serde(tag = "__typename")]
pub enum CoreEvent {
JobStarted(String),
JobUpdate(JobUpdate),
JobOutput { id: String, output: CoreJobOutput },
DiscoveredMissingLibrary(String),
CreatedMedia { id: String, series_id: String },
CreatedManySeries { count: u64, library_id: String },
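The new `JobOutput` variant lets the core hand a job's final output to clients as an event, keyed by the job id. A minimal std-only sketch of the consumer side, using simplified stand-in types (not Stump's real definitions, which carry serde/specta derives and more variants):

```rust
// Simplified stand-ins for Stump's CoreEvent / CoreJobOutput types.
#[derive(Debug, Clone)]
enum CoreJobOutput {
    LibraryScan { created_media: u64, skipped_files: u64 },
}

#[derive(Debug, Clone)]
enum CoreEvent {
    JobStarted(String),
    JobOutput { id: String, output: CoreJobOutput },
}

// A client-side consumer matching on the event, as a UI layer might.
fn describe(event: &CoreEvent) -> String {
    match event {
        CoreEvent::JobStarted(id) => format!("job {id} started"),
        CoreEvent::JobOutput { id, output } => match output {
            CoreJobOutput::LibraryScan { created_media, skipped_files } => {
                format!("job {id} finished: {created_media} created, {skipped_files} skipped")
            },
        },
    }
}

fn main() {
    let event = CoreEvent::JobOutput {
        id: "abc123".to_string(),
        output: CoreJobOutput::LibraryScan { created_media: 4, skipped_files: 10 },
    };
    println!("{}", describe(&event));
}
```

Because the real enum is tagged with `#[serde(tag = "__typename")]`, clients can discriminate variants by that field when the event arrives over the wire.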
17 changes: 13 additions & 4 deletions core/src/filesystem/scanner/library_scan_job.rs
@@ -10,7 +10,7 @@ use specta::Type;

use crate::{
db::{
entity::{LibraryOptions, Series},
entity::{CoreJobOutput, LibraryOptions, Series},
FileStatus, SeriesDAO, DAO,
},
filesystem::{
@@ -82,6 +82,9 @@ pub struct LibraryScanOutput {
total_directories: u64,
/// The number of files that were ignored during the scan
ignored_files: u64,
/// The number of files that were skipped during the scan, e.g. because a file
/// already exists in the database and has not been modified since the last scan
skipped_files: u64,
/// The number of ignored directories during the scan
ignored_directories: u64,
/// The number of media entities created
@@ -99,6 +102,7 @@ impl JobOutputExt for LibraryScanOutput {
self.total_files += updated.total_files;
self.total_directories += updated.total_directories;
self.ignored_files += updated.ignored_files;
self.skipped_files += updated.skipped_files;
self.ignored_directories += updated.ignored_directories;
self.created_media += updated.created_media;
self.updated_media += updated.updated_media;
@@ -216,9 +220,14 @@ impl JobExt for LibraryScanJob {

async fn cleanup(
&self,
_: &WorkerCtx,
ctx: &WorkerCtx,
output: &Self::Output,
) -> Result<Option<Box<dyn Executor>>, JobError> {
ctx.send_core_event(CoreEvent::JobOutput {
id: ctx.job_id.clone(),
output: CoreJobOutput::LibraryScan(output.clone()),
});

let did_create = output.created_series > 0 || output.created_media > 0;
let did_update = output.updated_series > 0 || output.updated_media > 0;
let image_options = self
@@ -427,6 +436,7 @@ impl JobExt for LibraryScanJob {
missing_media,
seen_files,
ignored_files,
skipped_files,
} = match walk_result {
Ok(walked_series) => walked_series,
Err(core_error) => {
@@ -446,6 +456,7 @@
};
output.total_files += seen_files + ignored_files;
output.ignored_files += ignored_files;
output.skipped_files += skipped_files;

if series_is_missing {
ctx.report_progress(JobProgress::msg("Series not found on disk!"));
@@ -499,8 +510,6 @@
task,
})
.collect();

ctx.report_progress(JobProgress::msg("Series walk complete!"));
},
LibraryScanTask::SeriesTask {
id: series_id,
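The `skipped_files` counter threads through the same additive-merge pattern as the other fields: each task emits a partial output, and `JobOutputExt::update` folds partials into the job's running total. A std-only sketch of that pattern, with a struct that mirrors a few `LibraryScanOutput` fields for illustration:

```rust
// Each scan task produces a partial output; the job folds them into one total.
#[derive(Debug, Default, Clone, Copy)]
struct ScanOutput {
    total_files: u64,
    ignored_files: u64,
    skipped_files: u64,
    created_media: u64,
}

impl ScanOutput {
    // Mirrors the shape of JobOutputExt::update: field-wise accumulation.
    fn update(&mut self, partial: Self) {
        self.total_files += partial.total_files;
        self.ignored_files += partial.ignored_files;
        self.skipped_files += partial.skipped_files;
        self.created_media += partial.created_media;
    }
}

fn merge_all(partials: &[ScanOutput]) -> ScanOutput {
    let mut total = ScanOutput::default();
    for partial in partials {
        total.update(*partial);
    }
    total
}

fn main() {
    let partials = [
        ScanOutput { total_files: 10, ignored_files: 1, skipped_files: 7, created_media: 2 },
        ScanOutput { total_files: 5, ignored_files: 0, skipped_files: 5, created_media: 0 },
    ];
    println!("{:?}", merge_all(&partials));
}
```

Since every field is a plain sum, per-series tasks can run and report independently while the final output still reflects the whole library.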
17 changes: 15 additions & 2 deletions core/src/filesystem/scanner/series_scan_job.rs
@@ -4,7 +4,10 @@ use serde::{Deserialize, Serialize};
use specta::Type;

use crate::{
db::{entity::LibraryOptions, FileStatus},
db::{
entity::{CoreJobOutput, LibraryOptions},
FileStatus,
},
filesystem::image::{ThumbnailGenerationJob, ThumbnailGenerationJobParams},
job::{
error::JobError, Executor, JobExt, JobOutputExt, JobProgress, JobTaskOutput,
@@ -56,6 +59,9 @@ pub struct SeriesScanOutput {
total_files: u64,
/// The number of files that were ignored during the scan
ignored_files: u64,
/// The number of files that were skipped during the scan, e.g. because a file
/// already exists in the database and has not been modified since the last scan
skipped_files: u64,
/// The number of media entities that were created
created_media: u64,
/// The number of media entities that were updated
@@ -66,6 +72,7 @@ impl JobOutputExt for SeriesScanOutput {
fn update(&mut self, updated: Self) {
self.total_files += updated.total_files;
self.ignored_files += updated.ignored_files;
self.skipped_files += updated.skipped_files;
self.created_media += updated.created_media;
self.updated_media += updated.updated_media;
}
@@ -114,6 +121,7 @@ impl JobExt for SeriesScanJob {
missing_media,
seen_files,
ignored_files,
skipped_files,
} = walk_series(
PathBuf::from(self.path.clone()).as_path(),
WalkerCtx {
@@ -138,6 +146,7 @@
);
output.total_files = seen_files + ignored_files;
output.ignored_files = ignored_files;
output.skipped_files = skipped_files;

let tasks = VecDeque::from(chain_optional_iter(
[],
@@ -164,9 +173,13 @@

async fn cleanup(
&self,
_: &WorkerCtx,
ctx: &WorkerCtx,
output: &Self::Output,
) -> Result<Option<Box<dyn Executor>>, JobError> {
ctx.send_core_event(CoreEvent::JobOutput {
id: ctx.job_id.clone(),
output: CoreJobOutput::SeriesScan(output.clone()),
});
let did_create = output.created_media > 0;
let did_update = output.updated_media > 0;
let image_options = self
90 changes: 42 additions & 48 deletions core/src/filesystem/scanner/utils.rs
@@ -1,12 +1,6 @@
use std::{
collections::VecDeque,
fs::File,
io::{BufRead, BufReader},
path::PathBuf,
};
use std::{collections::VecDeque, path::PathBuf};

use globset::{Glob, GlobSet, GlobSetBuilder};
use itertools::Itertools;
use globset::{GlobSet, GlobSetBuilder};
use prisma_client_rust::{
chrono::{DateTime, Utc},
QueryError,
@@ -59,46 +53,46 @@ pub(crate) fn file_updated_since_scan(
}
}

// TODO: should probably return result as to not scan files which the user would like to ignore
pub(crate) fn generate_rule_set(paths: &[PathBuf]) -> GlobSet {
let mut builder = GlobSetBuilder::new();

let adjusted_paths = paths
.iter()
// We have to remove duplicates here otherwise the glob will double some patterns.
// An example would be when the library has media in root. Not the end of the world.
.unique()
.filter(|p| p.exists())
.collect::<Vec<_>>();

tracing::trace!(?adjusted_paths, "Adjusted paths");

for path in adjusted_paths {
let ignore_file = path.join(".stumpignore");
let open_result = File::open(&ignore_file);
if let Ok(file) = open_result {
// read the lines of the file, and add each line as a glob pattern in the builder
for line in BufReader::new(file).lines() {
if let Err(e) = line {
tracing::error!(
?ignore_file,
error = ?e,
"Error occurred trying to read line from glob file",
);
continue;
}

// TODO: remove unwraps!
builder.add(Glob::new(&line.unwrap()).unwrap());
}
} else {
tracing::warn!(
error = ?open_result.err(),
?ignore_file,
"Failed to open .stumpignore file (does it exist?)",
);
}
}
// TODO: refactor for https://github.com/stumpapp/stump/issues/284
pub(crate) fn generate_rule_set(_: &[PathBuf]) -> GlobSet {
let builder = GlobSetBuilder::new();

// let adjusted_paths = paths
// .iter()
// // We have to remove duplicates here otherwise the glob will double some patterns.
// // An example would be when the library has media in root. Not the end of the world.
// .unique()
// .filter(|p| p.exists())
// .collect::<Vec<_>>();

// tracing::trace!(?adjusted_paths, "Adjusted paths");

// for path in adjusted_paths {
// let ignore_file = path.join(".stumpignore");
// let open_result = File::open(&ignore_file);
// if let Ok(file) = open_result {
// // read the lines of the file, and add each line as a glob pattern in the builder
// for line in BufReader::new(file).lines() {
// if let Err(e) = line {
// tracing::error!(
// ?ignore_file,
// error = ?e,
// "Error occurred trying to read line from glob file",
// );
// continue;
// }

// // TODO: remove unwraps!
// builder.add(Glob::new(&line.unwrap()).unwrap());
// }
// } else {
// tracing::warn!(
// error = ?open_result.err(),
// ?ignore_file,
// "Failed to open .stumpignore file (does it exist?)",
// );
// }
// }

builder.build().unwrap_or_default()
}
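The commented-out `.stumpignore` handling (deferred to issue #284) read each line of the ignore file as a glob pattern, skipping lines that failed to read rather than aborting the scan. A std-only sketch of just that line-collection step, without the `globset` dependency; the comment-line (`#`) filtering here is an assumption for illustration, not confirmed Stump behavior:

```rust
use std::io::{BufRead, BufReader, Read};

// Collect each non-empty, non-comment line of a .stumpignore file as a raw
// pattern string. Unreadable lines are skipped, as the original code did,
// so one bad line does not abort the whole scan.
fn read_ignore_patterns<R: Read>(reader: R) -> Vec<String> {
    BufReader::new(reader)
        .lines()
        // Drop lines that failed to read instead of propagating the error.
        .filter_map(|line| line.ok())
        .map(|line| line.trim().to_string())
        .filter(|line| !line.is_empty() && !line.starts_with('#'))
        .collect()
}

fn main() {
    let contents = "# ignore scratch files\n*.tmp\n\n.DS_Store\n";
    let patterns = read_ignore_patterns(contents.as_bytes());
    println!("{patterns:?}");
}
```

In the full implementation each collected pattern would be compiled and added to the `GlobSetBuilder`, and the refactor tracked in #284 will presumably restore that step with proper error handling in place of the `unwrap` calls.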