Reduce typo count. #42

Merged
2 changes: 1 addition & 1 deletion old_code/lox-macros/src/derives/mod.rs
@@ -16,7 +16,7 @@ use crate::util::struct_fields;


// /// Specifies the default casting mode when the casting mode is not explicitly
// /// specifid.
// /// specified.
// const DEFAULT_CAST_MODE: input::CastMode = input::CastMode::Lossy;

// /// Specifies whether casting colors (in "rounding" mode) is allowed when a
4 changes: 2 additions & 2 deletions old_code/loxi/src/args.rs
@@ -57,7 +57,7 @@ pub struct ConvertArgs {
pub target_format: Option<FileFormat>,

/// Specify the target file encoding. Valid values: 'binary' (native
/// endianess), 'bbe' (binary big endian), 'ble' (binary little endian) and
/// endianness), 'bbe' (binary big endian), 'ble' (binary little endian) and
/// 'ascii'.
#[structopt(
short = "-e",
@@ -147,7 +147,7 @@ pub struct InfoArgs {
// TODO:
// - different output types (short, json, ...)
// - output template?
// - include/exclude specific informations
// - include/exclude specific information
}

fn parse_file_format(src: &str) -> Result<FileFormat, String> {
8 changes: 4 additions & 4 deletions old_code/loxi/src/commands/convert.rs
@@ -114,8 +114,8 @@ pub fn run(global_args: &GlobalArgs, args: &ConvertArgs) -> Result<(), Error> {
println!();
}

// Check target compatability
check_compatability(&mesh, target_format);
// Check target compatibility
check_compatibility(&mesh, target_format);

// Write file
let before_write = Instant::now();
@@ -164,7 +164,7 @@ fn write_file(
) -> Result<(), Error> {
info!("Target format: {} ({} encoding)", format, encoding);

// We can `unwrap()` here becaue we know the encoding is compatible with
// We can `unwrap()` here because we know the encoding is compatible with
// the format.
let file = BufWriter::new(File::create(&args.target)?);
let writer = format.writer_with_encoding(encoding, file).unwrap();
@@ -176,7 +176,7 @@
Ok(())
}

fn check_compatability(
fn check_compatibility(
mesh: &AnyMesh,
format: FileFormat,
) {
2 changes: 1 addition & 1 deletion old_code/loxi/src/main.rs
@@ -17,7 +17,7 @@ use crate::{
/// useful code is in `run()`.
fn main() {
if let Err(e) = run() {
error!("An error occured: {}", e);
error!("An error occurred: {}", e);

for cause in e.iter_causes() {
error!(" ... caused by: {}", cause);
4 changes: 2 additions & 2 deletions old_code/loxi/src/util.rs
@@ -17,7 +17,7 @@ pub enum EncodingRequest {
/// Specific encoding: native endian binary
BinaryNative,

/// Broad request: some binary encoding, native endianess is preferred
/// Broad request: some binary encoding, native endianness is preferred
Binary,
}

@@ -41,7 +41,7 @@ impl FromStr for EncodingRequest {
"ble" => Ok(EncodingRequest::Specific(FileEncoding::BinaryLittleEndian)),
"ascii" => Ok(EncodingRequest::Specific(FileEncoding::Ascii)),
other => Err(format!(
"'{}' is not a valid file encoding (possible values: 'binary' (endianess not \
"'{}' is not a valid file encoding (possible values: 'binary' (endianness not \
specified, but prefer native), 'bne' (binary native endian), \
'bbe' (binary big endian), 'ble' (binary little endian) and 'ascii')",
other,
2 changes: 1 addition & 1 deletion old_code/src/core/face_delegate.rs
@@ -194,7 +194,7 @@ impl FaceDelegateMesh {
// So what do we have to do?
//
// For one, we need to change the `next_face` of the face prior
// to us in the clockwise cycle. There are two possibilties:
// to us in the clockwise cycle. There are two possibilities:
// either the new face is inserted at the end (CW speaking) of
// an already existing fan-blade, meaning that the last face of
// a fan-blade is adjacent to the new face. Or this is not the
22 changes: 11 additions & 11 deletions old_code/src/io/mod.rs
@@ -514,8 +514,8 @@ impl FileFormat {
/// already specify the types on this method. But we do get a significant
/// speed advantage. See [`DynStreamSink`] for more information.
///
/// The encoding is choosen depending on what the format supports. Native
/// binary encoding is preferred, followed by swapped-endianess binary,
/// The encoding is chosen depending on what the format supports. Native
/// binary encoding is preferred, followed by swapped-endianness binary,
/// followed by ASCII encoding. If you need to specify the encoding, take a
/// look at [`writer_with_encoding`][FileFormat::writer_with_encoding].
pub fn writer<'a, SrcT, W>(&self, w: W) -> Box<dyn DynStreamSink<SrcT> + 'a>
@@ -616,7 +616,7 @@ pub enum FileEncoding {
}

impl FileEncoding {
/// Returns the binary encoding with native endianess (e.g.
/// Returns the binary encoding with native endianness (e.g.
/// `BinaryLittleEndian` on x86).
pub fn binary_native() -> Self {
#[cfg(target_endian = "big")]
@@ -778,7 +778,7 @@ pub enum ErrorKind {
///
/// If you encounter this error, here is what you can do: make sure your
/// input file is well-formed. If you are sure that your file is fine and
/// other programs can succesfully parse that file, please consider
/// other programs can successfully parse that file, please consider
/// reporting this as a parser bug.
Parse(ParseError),

@@ -789,18 +789,18 @@ pub enum ErrorKind {
/// this `InvalidInput` rather represents logical errors in the file (like
/// faces not defining their vertices or wrong order of elements).
/// Furthermore, parse errors can usually point to the exact part of the
/// file where the error occured. These general input errors are more
/// file where the error occurred. These general input errors are more
/// abstract and often don't just belong to one specific span.
///
/// If you encounter this error, here is what you can do: make sure your
/// input file is well-formed. If you are sure that your file is fine and
/// other programs can succesfully parse that file, please consider
/// other programs can successfully parse that file, please consider
/// reporting this as a parser bug.
InvalidInput(String),

/// The sink is somehow unable to store the incoming data.
///
/// This might have a variety of differet causes. For example, some file
/// This might have a variety of different causes. For example, some file
/// formats only support 32 bit indices for elements, meaning that
/// attempting to store a mesh with more than 2<sup>32</sup> elements with
/// that format would fail with this error.
@@ -1064,7 +1064,7 @@ pub trait Primitive: PrimitiveNum + Sealed {
const TY: PrimitiveType;

/// Returns the channel type represented at runtime by
/// [`PrimitiveColorChannelType`] for `Primitive` types thare are also a
/// [`PrimitiveColorChannelType`] for `Primitive` types that are also a
/// [`PrimitiveColorChannel`].
fn channel_type() -> PrimitiveColorChannelType
where
@@ -1207,7 +1207,7 @@ pub trait StreamSource {
/// not object-safe (i.e. cannot be made into a trait-object). This is OK for
/// most uses, but sometimes a dynamically dispatched source is necessary.
/// That's what this trait is for. It moves the generic `SinkT` parameter from
/// the method to the trait to make it possible ot have a
/// the method to the trait to make it possible to have a
/// `dyn DynStreamSink<MySource>`.
///
/// For more information, see [`DynStreamSink`] which works exactly like this
@@ -1496,7 +1496,7 @@ pub trait StreamSink {
/// object-safe (i.e. cannot be made into a trait-object). This is OK for most
/// uses, but sometimes a dynamically dispatched sink is necessary. That's what
/// this trait is for. It moves the generic `SrcT` parameter from the method to
/// the trait to make it possible ot have a `dyn DynStreamSink<MySource>`.
/// the trait to make it possible to have a `dyn DynStreamSink<MySource>`.
///
/// Having the source type as a trait parameter does restrict the potential
/// usages of this trait. In other words: you either have to know the type of
@@ -1564,7 +1564,7 @@ where
/// The handles passed to all main property methods must be valid handles
/// obtained from the mesh returned by `core_mesh()`.
///
/// All property methods have a default implemention which returns `None` in
/// All property methods have a default implementation which returns `None` in
/// `*_type` and panics in `*`.
///
///
6 changes: 3 additions & 3 deletions old_code/src/io/ply/mod.rs
@@ -9,7 +9,7 @@
//! positions are stored in three properties called `x`, `y` and `z`.
//!
//! PLY files can be encoded as ASCII or as binary with either big or small
//! endianess. While the ASCII encoding is space-inefficient (as usual), the
//! endianness. While the ASCII encoding is space-inefficient (as usual), the
//! binary formats are very close to be memory-optimal. The only very minor
//! waste is that the number of vertices per face is always stored -- which is
//! not necessary in the triangle-only case. But this only wastes one byte per
@@ -67,7 +67,7 @@ pub use self::write::{Config, Writer};

// ----------------------------------------------------------------------------

/// File name extentions used for this file format: `.ply`.
/// File name extensions used for this file format: `.ply`.
pub(super) const FILE_EXTENSIONS: &[&str] = &["ply"];

/// Check if the given data from the start of the file is a valid PLY file
@@ -98,7 +98,7 @@ pub enum Encoding {
}

impl Encoding {
/// Returns the binary encoding with native endianess (little endian on
/// Returns the binary encoding with native endianness (little endian on
/// x86).
pub fn binary_native() -> Self {
if cfg!(target_endian = "big") {
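The `binary_native()` hunk just above only shows the doc-comment fix, so for context, here is a minimal standalone sketch of the kind of native-endianness selection such a function performs. The enum here is a stand-in with assumed variant names, not the crate's actual `Encoding` type.

```rust
/// Stand-in enum for illustration; variant names are assumed.
#[allow(dead_code)]
#[derive(Debug, Clone, Copy)]
enum Encoding {
    BinaryBigEndian,
    BinaryLittleEndian,
    Ascii,
}

impl Encoding {
    /// Returns the binary encoding with native endianness
    /// (little endian on x86, as the doc comment above says).
    fn binary_native() -> Self {
        if cfg!(target_endian = "big") {
            Encoding::BinaryBigEndian
        } else {
            Encoding::BinaryLittleEndian
        }
    }
}

fn main() {
    // On x86/x86_64 this prints `BinaryLittleEndian`.
    println!("{:?}", Encoding::binary_native());
}
```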
12 changes: 6 additions & 6 deletions old_code/src/io/ply/raw.rs
@@ -68,7 +68,7 @@ pub trait Serializer {
/// Encode a slice of values into the serializer. If you have multiple
/// values of the same type, use this function to improve performance. The
/// slice is a mutable slice because the method implementation might want
/// to mutate it without creating a copy (e.g. changing endianess).
/// to mutate it without creating a copy (e.g. changing endianness).
fn add_slice<P: PlyScalar>(&mut self, s: &mut [P]) -> Result<(), Error> {
for x in s {
self.add(*x)?;
@@ -346,7 +346,7 @@ impl fmt::Debug for ScalarTypeParseError {
/// same for all elements of one element group.
#[derive(Clone)]
pub struct RawElement {
/// The packed data of all properties in native endianess.
/// The packed data of all properties in native endianness.
pub data: RawData,

/// Some meta information about each property in this element.
@@ -600,10 +600,10 @@ impl fmt::Debug for PropIndex {
}
}

/// Raw data of one element in native endianess. Can be indexed by `RawOffset`.
/// Raw data of one element in native endianness. Can be indexed by `RawOffset`.
///
/// For PLY files stored in native endianess, this is an exact chunk from
/// the file. For ASCII files and files in non-native endianess, the
/// For PLY files stored in native endianness, this is an exact chunk from
/// the file. For ASCII files and files in non-native endianness, the
/// properties are first converted to this format.
#[derive(Debug, Clone, From)]
pub struct RawData(Vec<u8>);
@@ -745,7 +745,7 @@ impl<T> ops::DerefMut for PropVec<T> {
/// This is a fairly space inefficient representation of properties and it's
/// pretty slow. This should only be used for debugging and testing.
///
/// The sizes of the smallvecs are choosen so that the inline variant won't
/// The sizes of the smallvecs are chosen so that the inline variant won't
/// inflict a size overhead (on x64). This still means that the most common
/// form of list, the three-tuple `vertex_indices`, will fit inline.
#[derive(Clone, PartialEq)]
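The `add_slice` hunk at the top of this file's diff explains that the slice is mutable so an implementation can swap endianness in place without copying. A small sketch of what that looks like; the helper below is illustrative only, not the crate's API.

```rust
// Illustrative only: flip the byte order of each value in place, the way a
// non-native-endian serializer might before writing the raw bytes out.
fn swap_endianness_in_place(values: &mut [u32]) {
    for v in values.iter_mut() {
        *v = v.swap_bytes();
    }
}

fn main() {
    let mut data = [0x1122_3344_u32, 0xAABB_CCDD_u32];
    swap_endianness_in_place(&mut data);
    assert_eq!(data, [0x4433_2211, 0xDDCC_BBAA]);
    println!("{:08x?}", data);
}
```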
12 changes: 6 additions & 6 deletions old_code/src/io/ply/read.rs
@@ -226,7 +226,7 @@ impl<R: io::Read> Reader<R> {
() if buf.is_next(b"property ")? => {
let line_start = buf.offset();

// Get last element or error if there wasn't a preceeding
// Get last element or error if there wasn't a preceding
// `element` line.
let elem = elements.last_mut().ok_or_else(|| {
buf.spanned_data(b"property".len())
@@ -665,7 +665,7 @@ impl IdxLayout for SeparateIdx {
}

/// Something that can provide indices for a specific property. These indices
/// denote the position of the invididual values of the property inside the raw
/// denote the position of the individual values of the property inside the raw
/// element data.
trait IdxProvider<Prop: PropKind> {
fn idx(&self) -> &<Prop::Layout as PropLayout>::Idx;
@@ -688,7 +688,7 @@ impl_idx_provider!(FaceReadState, NormalProp<T>, normal_idx);
impl_idx_provider!(FaceReadState, RgbaColorProp, color_idx);
impl_idx_provider!(EdgeReadState, RgbaColorProp, color_idx);

// Manual impls for RGB as we want to resuse the 4 element RGBA index.
// Manual impls for RGB as we want to reuse the 4 element RGBA index.
impl IdxProvider<RgbColorProp> for VertexReadState {
fn idx(&self) -> &<<RgbColorProp as PropKind>::Layout as PropLayout>::Idx {
(&self.color_idx[..3]).try_into().unwrap()
@@ -821,7 +821,7 @@
}

fn bug_read_prop<Sink, State>(_: &mut Sink, _: &RawElement, _: &State) {
panic!("bug in PLY `RawTransferSink`: property reader of non-existant property called");
panic!("bug in PLY `RawTransferSink`: property reader of non-existent property called");
}

type VertexPropHandler<S> = fn(&mut S, &RawElement, &VertexReadState);
@@ -1509,7 +1509,7 @@ fn read_raw_element_group_binary<R: io::Read>(


if byte_swap {
// ----- Swapped endianess -----
// ----- Swapped endianness -----
for _ in 0..element_def.count {
swap_table.truncate(first_list_prop);
let prop_infos = &mut (*elem.prop_infos)[first_list_prop..];
@@ -1531,7 +1531,7 @@
sink.element(&elem)?;
}
} else {
// ----- Native endianess -----
// ----- Native endianness -----
for _ in 0..element_def.count {
let prop_infos = &mut (*elem.prop_infos)[first_list_prop..];
elem.data.clear();
2 changes: 1 addition & 1 deletion old_code/src/io/ply/tests.rs
@@ -40,7 +40,7 @@ fn test_is_file_start() {
/// This abstract macro takes the name of another macro and invokes it three
/// times, with the identifiers `ascii`, `ble` and `bbe` as first argument.
///
/// It also accepts arbitarily any other args that are forwarded to the
/// It also accepts arbitrarily any other args that are forwarded to the
/// `generator` macro.
macro_rules! gen_for_encodings {
($generator:ident $(, $args:ident)*) => {
2 changes: 1 addition & 1 deletion old_code/src/io/stl/mod.rs
@@ -66,7 +66,7 @@ pub use self::write::{Config, Writer};

// ----------------------------------------------------------------------------

/// File name extentions used for this file format: `.stl`.
/// File name extensions used for this file format: `.stl`.
pub(super) const FILE_EXTENSIONS: &[&str] = &["stl"];

/// Check if the given data from the start of the file is a valid STL file
4 changes: 2 additions & 2 deletions old_code/src/io/util.rs
@@ -178,7 +178,7 @@ pub trait MemSourceExt {
/// given vertex colors instead of the original.
///
/// This also works if the original source does not offer vertex colors.
/// TODO: make possible to specifiy cast rigor
/// TODO: make possible to specify cast rigor
fn with_vertex_colors<'a, M>(
&'a self,
vertex_colors: &'a M,
@@ -216,7 +216,7 @@ pub trait MemSourceExt {
/// given face colors instead of the original.
///
/// This also works if the original source does not offer face colors.
/// TODO: make possible to specifiy cast rigor
/// TODO: make possible to specify cast rigor
fn with_face_colors<'a, M>(
&'a self,
face_colors: &'a M,
2 changes: 1 addition & 1 deletion old_code/src/shape/mod.rs
@@ -44,7 +44,7 @@ use crate::{
};


/// A regular Tetrahedron: pyramide with triangle-base and pointy top, all
/// A regular Tetrahedron: pyramid with triangle-base and pointy top, all
/// sides are equilateral triangles.
#[derive(Debug, Clone, Copy)]
pub struct Tetrahedron {
4 changes: 2 additions & 2 deletions src/algo/mod.rs
@@ -36,7 +36,7 @@

mesh.vertices().map(|v| {
// If the vertex is a boundary vertex, its position doesn't change. If
// not, we use the centroid of all neighbors' position as new positon.
// not, we use the centroid of all neighbors' position as new position.
let new_pos = if v.is_boundary() {
pos_of(v)
} else {
@@ -66,7 +66,7 @@
// and no fucked-up edges), the two are equivalent, because:
// - if (b) => each face has as many edges as vertices. On each edge of the
// face, there can only be one other face. Since there are as many
// adjacent faces as adjacent vertices/eges, each edge has two adjacent
// adjacent faces as adjacent vertices/edges, each edge has two adjacent
// faces.
// - if (a) => if all edges of a face have two adjacent faces, the face has
// as many adjacent faces as edges. Which is also the same number as the
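The comment changed in the hunk above describes the smoothing rule: boundary vertices keep their position, interior vertices move to the centroid of their neighbors' positions. A minimal sketch of that centroid step on plain `[f32; 3]` positions, without the crate's mesh and handle types:

```rust
// Illustrative only: centroid of the neighbors' positions, used as the new
// position for a non-boundary vertex.
fn centroid(neighbor_positions: &[[f32; 3]]) -> [f32; 3] {
    let n = neighbor_positions.len() as f32;
    let mut sum = [0.0f32; 3];
    for p in neighbor_positions {
        for i in 0..3 {
            sum[i] += p[i];
        }
    }
    [sum[0] / n, sum[1] / n, sum[2] / n]
}

fn main() {
    let neighbors = [[0.0, 0.0, 0.0], [2.0, 0.0, 0.0], [1.0, 3.0, 0.0]];
    // Prints [1.0, 1.0, 0.0]
    println!("{:?}", centroid(&neighbors));
}
```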
2 changes: 1 addition & 1 deletion src/cast.rs
@@ -23,7 +23,7 @@
//! a number to fit in a smaller range (e.g. `500u16` to `u8`). "Rounding"
//! means that the input number is inside the range of the destination type,
//! but can not be exactly represented; a reasonable close number of the
//! destiniation type is choosen (e.g. float to int).
//! destiniation type is chosen (e.g. float to int).
//!
//! Two binary choices lead to four different fidelities plus one extra one to
//! disallow any type change:
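The cast.rs hunk above distinguishes "clamping" (the value lies outside the target range) from "rounding" (the value is in range but not exactly representable). Two tiny illustrative casts make the difference concrete; these functions are examples, not the crate's cast API.

```rust
// "Clamping": 500 does not fit in u8, so it is clamped to u8::MAX.
fn clamping_cast_u16_to_u8(x: u16) -> u8 {
    x.min(u8::MAX as u16) as u8
}

// "Rounding": 2.7 is inside i32's range but not exactly representable,
// so a reasonably close integer is chosen.
fn rounding_cast_f32_to_i32(x: f32) -> i32 {
    x.round() as i32
}

fn main() {
    assert_eq!(clamping_cast_u16_to_u8(500), 255);
    assert_eq!(rounding_cast_f32_to_i32(2.7), 3);
    println!("ok");
}
```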
4 changes: 2 additions & 2 deletions src/core/directed_edge/mod.rs
@@ -121,7 +121,7 @@ impl fmt::Debug for HalfEdgeHandle {
/// This data structure stores information in directed edges which are stored
/// per face (each face has exactly three). Each directed edge stores its twin
/// directed edge and its target vertex. The `next` and `prev` handles to
/// circulate around a face are typically not stored but given implictly by the
/// circulate around a face are typically not stored but given implicitly by the
/// memory location of the directed edge: all three directed edges of a face
/// are stored contiguously in memory.
///
@@ -159,7 +159,7 @@ pub(crate) struct Vertex {
///
/// The original paper suggest to store -1 to indicate a boundary edge and -2
/// to indicate a non-manifold edge. Since this library does not support
/// non-manifold edges anyway, we don't need the -2 sentinal value. The
/// non-manifold edges anyway, we don't need the -2 sentinel value. The
/// original paper also does not describe a way to iterate along the boundary
/// of a mesh. We can improve this by using this field to store the next
/// boundary edge (kind of).
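The hunk above notes that `next` and `prev` are typically not stored but are implied by memory location, since the three directed edges of a face sit contiguously. A sketch of how such handles can be computed from a plain index, assuming the common layout where face `f` owns edges `3f`, `3f + 1`, `3f + 2` (handle types simplified to bare `u32`):

```rust
// Illustrative only: derive `next`/`prev` from the index alone.
fn next(edge: u32) -> u32 {
    let face_base = edge - edge % 3; // first edge of the owning face
    face_base + (edge + 1) % 3
}

fn prev(edge: u32) -> u32 {
    let face_base = edge - edge % 3;
    face_base + (edge + 2) % 3
}

fn main() {
    // The three edges of face 1 are 3, 4, 5.
    assert_eq!(next(3), 4);
    assert_eq!(next(5), 3);
    assert_eq!(prev(3), 5);
    println!("ok");
}
```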