Blog

  • compose_architecture

    compose_architecture

    Demonstrates the MVVM, MVI, and Redux architectures in Jetpack Compose.

    compose redux

    Adding the dependency

    Add the Maven repository and the artifact:

    maven { url "https://raw.githubusercontent.com/feiyin0719/compose_architecture/main" }
    implementation 'com.iffly:redux:0.0.3'

    Example code

    Create the state and action classes, and create a reducer class by extending Reducer:

    data class CountAction(val type: CountActionType, val data: Int) {
        enum class CountActionType {
            Add, Reduce
        }
    
        companion object {
    
            infix fun addWith(data: Int): CountAction {
                return CountAction(CountActionType.Add, data = data)
            }
    
            infix fun reduceWith(data: Int): CountAction {
                return CountAction(CountActionType.Reduce, data = data)
            }
        }
    }
    
    data class CountState(val count: Int = 1) {
        val doubleCount: Int get() = count * 2
    }
    
    class CountReducer :
        Reducer<CountState, CountAction>(CountState::class.java, CountAction::class.java) {
        override fun reduce(
            countState: CountState,
            flow: Flow<CountAction>
        ): Flow<CountState> {
            return flow.flowOn(Dispatchers.IO).flatMapConcat { action ->
                flow {
                    if (action.type == CountAction.CountActionType.Add)
                        emit(countState.copy(count = countState.count + action.data))
                    else
                        emit(countState.copy(count = countState.count - action.data))
                }
            }.flowOn(Dispatchers.IO)
        }
    }

    Using data classes for the state and action makes them easy to create, and Kotlin's copy method allows quick state updates.

    The state class must provide a no-argument constructor so that it can serve as the initial state.

    The reducer's reduce method receives the current state and a Flow of actions, which makes asynchronous operations convenient; it just needs to return a Flow of states.

    Create the store during app initialization, passing in a list of reducers. This differs slightly from the JavaScript Redux implementation:

    val s =
            storeViewModel(
                listOf(CountReducer(), CountFloatReducer())
            )
    

    Because storeViewModel is bound to the activity lifecycle, whenever you need the store you can simply call storeViewModel to obtain it, get the state you need via getState, and dispatch the corresponding action to perform an operation:

        val s = storeViewModel()
        val state: CountState by s.getState(CountState::class.java)
            .observeAsState(CountState(1))
        Content2(count = state.count) {
            s.dispatch(CountAction reduceWith 1)
        }
            

    middleware

    Redux also provides middleware, which further enriches its capabilities, and custom middleware is easy to write.

    For example, to implement a redux-thunk we only need to extend MiddleWare:

    class FunctionActionMiddleWare : MiddleWare {
    
        fun interface FunctionAction {
            suspend operator fun invoke(dispatchAction: StoreDispatch, state: StoreState): Any?
        }
    
        override suspend fun invoke(store: StoreViewModel): (MiddleWareDispatch) -> MiddleWareDispatch {
            return { next ->
                MiddleWareDispatch { action ->
                    if (action is FunctionAction)
                        action(store, store)
                    else {
                        next.dispatchAction(action = action)
                    }
                }
            }
        }
    }
    

    Then simply pass the middleware list when creating the store:

     val s =
            storeViewModel(
                listOf(CountReducer(), CountFloatReducer()),
                listOf(FunctionActionMiddleWare())
            )

    This way we can dispatch a function as an action, enriching dispatch's capabilities; the dispatch can also return a value:

    val i = s.dispatch(FunctionActionMiddleWare.FunctionAction { storeDispatch: StoreDispatch, _: StoreState ->
                storeDispatch.dispatch(CountAction addWith 1)
                storeDispatch.dispatch(CountAction addWith 1)
                1
            })
    

    Dependent state

    In development there is a class of state that changes based on one or more other states; the depState method is provided to handle it.

    We only need to define the dependent states and the transform method in advance:

    data class DepState2(val depCount: Int = 0) {
    
        companion object {
            fun transform(countState: CountState, countFloatState: CountFloatState): DepState2 {
                return DepState2((countState.count + (countFloatState.count)).toInt())
            }
        }
    }

    The states it depends on are declared as the parameters of transform; we create the dependent state via the depState method:

     s.depState(DepState2::transform)

    It can then be retrieved and used via getState just like an ordinary state.

    See the source code for more usage details.

    Implementation

    The Redux implementation is built on Jetpack ViewModel, LiveData, and Kotlin coroutine Flow. A globally available store is created by binding to the activity lifecycle; the store is a ViewModel that holds the states and reducers. Reducer processing is based on Flow, which makes asynchronous operations easy, and state is kept in LiveData.

    Visit original content creator repository
    https://github.com/feiyin0719/compose_architecture

  • mandelbrot_kokkos_cmake

    Visit original content creator repository
    https://github.com/tpadioleau/mandelbrot_kokkos_cmake

  • ttn-ulm-muecke

    Muecke

    A server that subscribes to many MQTT streams (or applications) on the TTN network
    and saves the (sensor) data into a MySQL or InfluxDB database. If enabled, it can
    redirect the sensor data to another HTTP API, example included.

    Supports hot reloading: when an app is added to or removed from the database, the service
    will subscribe to or unsubscribe from its MQTT stream.
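
    A possible sketch of that hot-reload bookkeeping in Python: compare the apps present in the database with the current subscriptions and diff the two sets. Function and topic names here are illustrative assumptions, not the actual implementation.

    ```python
    def diff_subscriptions(db_apps, subscribed):
        """Compute which app streams to subscribe to and unsubscribe from.

        db_apps: app IDs currently present in the database.
        subscribed: app IDs the MQTT client is currently subscribed to.
        """
        to_subscribe = set(db_apps) - set(subscribed)
        to_unsubscribe = set(subscribed) - set(db_apps)
        return to_subscribe, to_unsubscribe

    # On each reload tick, the result would drive paho-mqtt calls such as
    # client.subscribe(...) and client.unsubscribe(...) per app topic.
    ```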

    Main dependencies

    • paho-mqtt
    • PyMySQL
    • influxdb
    • requests

    It’s only tested with Python 3. Could work on Python 2, but I don’t know 🙂

    Install

    • Install MySQL/MariaDB and/or InfluxDB, according to the docs of these services.
    • For MySQL: Run the ‘CREATE TABLE’ commands in schema/database.sql
    • For InfluxDB: Just install and create a database. Credentials and database name can be set in config.py.
    • Edit the config and insert your database credentials
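
    For orientation, the database credentials might look like the following config.py sketch; all option names here are hypothetical, so use the keys the shipped config actually defines.

    ```python
    # Hypothetical config.py sketch -- option names are illustrative only.

    # MySQL/MariaDB
    MYSQL_HOST = "localhost"
    MYSQL_USER = "ttn"
    MYSQL_PASSWORD = "secret"
    MYSQL_DATABASE = "muecke"

    # InfluxDB
    INFLUX_HOST = "localhost"
    INFLUX_PORT = 8086
    INFLUX_DATABASE = "muecke"

    # Optional forwarding of sensor data to another HTTP API
    FORWARD_ENABLED = False
    FORWARD_URL = "https://example.org/api/sensordata"
    ```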

    Usage

    Create a virtual environment and activate it:

    virtualenv <custom-path-to-virtual-env>
    source <custom-path-to-virtual-env>/bin/activate
    

    Install all dependencies:

    pip install -r requirements.txt
    

    Run the server:

    python3 server.py
    

    …or create a systemd service for a more stable service.

    License

    AGPL 3.0

    See agpl-3.0.txt.

    Visit original content creator repository
    https://github.com/temporaerhaus/ttn-ulm-muecke

  • bilidown

    Bilidown

    GitHub Release

    A Bilibili video parsing and download tool. Supports 8K video, Hi-Res audio, Dolby Vision downloads, and batch parsing; supports QR-code login and runs persistently in the system tray.

    Supported link types

    Usage

    1. Download the installer package for your system from Releases
    2. On non-Windows systems, install the FFmpeg tool first
    3. Extract the package and run the executable

    Features

    1. The frontend is built with Bootstrap and VanJS, lightweight and clean
    2. The backend is written in Go with a SQLite database, simplifying build and deployment
    3. The frontend uses p-queue to limit concurrent requests, speeding up batch parsing

    Notes

    • This program does not support, and does not recommend, HTTP proxies; accessing directly from a network inside China improves the success rate and stability of batch parsing.

    Building the executable

    git clone https://github.com/iuroc/bilidown
    cd bilidown/client
    pnpm install
    pnpm build
    cd ../server
    go mod tidy
    CGO_ENABLED=1 go build

    Cross compilation

    Notes

    • Image name: iuroc/cgo-cross-build
    • Supported target platforms
      • linux/amd64
      • windows/amd64
      • windows/386
      • windows/arm64
      • darwin/amd64
      • darwin/arm64

    Pull the image and clone the source

    docker pull iuroc/cgo-cross-build:latest
    git clone https://github.com/iuroc/bilidown

    Cross-compile release builds

    • When the goreleaser command runs, pnpm build and go mod tidy are executed automatically.
    cd bilidown/server
    # [Cross-compile releases]
    docker run --rm -v .:/usr/src/data iuroc/cgo-cross-build goreleaser release --snapshot --clean
    
    # [Interactive shell]
    cd bilidown
    docker run --rm -it -v .:/usr/src/data iuroc/cgo-cross-build

    Building for a specific target platform

    cd bilidown/server
    
    # [DEFAULT: linux-amd64]
    docker run --rm -v .:/usr/src/data iuroc/cgo-cross-build go build -o dist/bilidown-linux-amd64/bilidown
    
    # [darwin-amd64]
    docker run --rm -v .:/usr/src/data -e GOOS=darwin -e GOARCH=amd64 -e CC=o64-clang -e CGO_ENABLED=1 iuroc/cgo-cross-build go build -o dist/bilidown-darwin-amd64/bilidown

    Building without Docker

    When running go build on the Linux amd64 platform, you may need to install the following packages:

    sudo apt install pkg-config gcc libayatana-appindicator3-dev

    Development environment

    # client
    pnpm install
    pnpm dev
    # server
    go build && ./bilidown

    Special thanks

    Screenshots

    Star History

    Star History Chart

    Visit original content creator repository https://github.com/iuroc/bilidown
  • pqtl_pipeline_finemap

    pqtl_pipeline_finemap

    Fine mapping analysis within the pQTL pipeline project at Human Technopole, Milan, Italy

    We started this analysis pipeline in early April 2024. We adopted the Nextflow (NF) pipeline developed by the Statistical Genomics team at Human Technopole and reimplemented it in Snakemake (SMK). We independently validated each of the analyses described below before incorporating it into SMK.

    Locus Breaker

    We incorporated the Locus Breaker (LB) function, written in R (see publication PMID:), for example to process meta-analysis GWAS results of the proteins, and deployed it in SMK in mid-April 2024.

    COJO Conditional Analysis

    When the pipeline runs, the rule run_cojo will generate the output files below:

    • list of independent variants produced by GCTA cojo-slct (TSV/CSV)
    • conditional dataset for each independent signal produced by GCTA cojo-cond (RDS)
    • fine-mapping results from the coloc::coloc.abf function, containing values such as l-ABF and posterior probabilities (PPI) for each variant (RDS)
    • colocalization info table containing credible-set variants (with cumulative PPI > 0.99) for each independent variant
    • regional association plots

    These outputs are stored under the workspace_path provided by the user in config_finemap.yaml, in the directory <workspace_path>/results/*/cojo/

    Colocalization of Two Proteins

    We performed colocalization (Giambartolomei et al., 2014) across the pQTL signals. To meet the fundamental assumption of colocalization of only one causal variant per locus, we used conditional datasets, thus performing one colocalization test per pair of independent SNPs in 2 overlapping loci. For each regional association and each target SNP, we identified a credible set as the set of variants with posterior inclusion probability (PIP) > 0.99 within the region. More precisely, using the conditional dataset, we computed Approximate Bayes Factors (ABF) with the ‘process.dataset’ function in the coloc v5.2.3 R package and calculated posterior probabilities by normalizing ABFs across variants. Variants were ranked, and those with a cumulative posterior probability exceeding 0.99 were included in the credible sets. Among XXX protein pairs with overlapping loci, XXX protein pairs sharing a credible set variant were then tested for colocalization using the ‘coloc.abf’ function. Colocalized pairs were identified when the posterior probability for hypothesis 4 assuming a shared causal variant for two proteins exceeded 0.80.
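
    The credible-set construction described above can be sketched in plain Python; this is a simplified illustration of normalizing log-ABFs into posterior probabilities and ranking variants, not the pipeline's actual code, which relies on the coloc R package.

    ```python
    from math import exp

    def credible_set(labf, threshold=0.99):
        """Return indices of the credible set, given per-variant log-ABFs."""
        m = max(labf)
        weights = [exp(x - m) for x in labf]       # stabilized ABFs
        total = sum(weights)
        pp = [w / total for w in weights]          # posterior probabilities
        order = sorted(range(len(pp)), key=lambda i: pp[i], reverse=True)
        chosen, cum = [], 0.0
        for i in order:                            # rank variants, largest first
            chosen.append(i)
            cum += pp[i]
            if cum > threshold:                    # cumulative probability cutoff
                break
        return chosen
    ```

    A variant with a much larger log-ABF than the rest forms a one-variant credible set; with a flat signal the set grows to cover most of the region.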

    New Features on Top of NF pipeline

    We also incorporated new features, such as excluding signals in the HLA and NLRP12 regions from the results and follow-up analyses; the user can control this through the configuration file.

    NOTE

    This SMK pipeline, designed for the pQTL project, does not include munging and alignment of input GWAS summary files. Therefore, your GWAS results MUST be fully harmonized with your genotype data; e.g. variant IDs and reference/alternate (effect/other) alleles should be concordant across your input files. Our GWAS summary stats from REGENIE are already aligned by the QC pipeline (built on GWASLab) developed by the pQTL analysts team at the Health Data Science Center.

    How to run the pipeline:

    You can use the default configuration file in config/config_finemap.yaml, or prepare your own configuration in the config/ folder. In that case, make sure that configfile in workflow/Snakefile matches your newly created config file name. Then run the pipeline with the command below in bash.

    sbatch submit.sh

    Not interested to run colocalization?

    If you want to skip running colocalization with your traits, uncomment #--until collect_credible_sets in the Makefile. If you want to skip both COJO and colocalization and only run the locus breaker, change that option in the Makefile to --until collect_loci and run the pipeline as described above.

    Workflow example

    example workflow

    Visit original content creator repository https://github.com/ht-diva/pqtl_pipeline_finemap
  • reearth-classic

    Originally called Re:Earth, the product was split into Visualizer and Classic following a major revamp. This repository is Classic. It is intended to run the existing Re:Earth cloud service without significantly changing the Re:Earth code base. No new features will be added; only bug fixes and other maintenance.

    See Re:Earth Visualizer repo to access the latest Re:Earth!


    Logo

    Website · Documentation · Figma · Discord

    reearth-demo-short.mp4

    💡 We are hiring full-time OSS committers! https://eukarya.io/join

    Features

    • 🔌 Highly extensible thanks to the robust plugin system
    • 💻 Super handy being browser-based
    • 💪 Supports standard GIS data formats (CSV, KML, CZML, GeoJSON and shapefile)
    • 📢 Easily make a project public
    • ✨ Freely style the map

    Environment

    OS

    • Windows 10+
    • Apple macOS 10.12 (macOS Sierra)+
    • ChromeOS
    • iOS 11+
    • Android 10+
    • Linux (with a desktop)

    Web Browsers

    • Edge 91+
    • Firefox 57+
    • Chrome 58+
    • Safari 11+
    • iOS Safari (last 2 versions)
    • Chrome for Android (last 2 versions)

    Community

    Discord: Feel free to come in!

    Contributing

    See the contributing guide.

    Contact

    Re:Earth core committers: community@reearth.io

    License

    Distributed under the Apache-2.0 License. See LICENSE for more information.

    Visit original content creator repository https://github.com/reearth/reearth-classic
  • homeswitch

                                     Apache License
                               Version 2.0, January 2004
                            http://www.apache.org/licenses/
    
       TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
    
       1. Definitions.
    
          "License" shall mean the terms and conditions for use, reproduction,
          and distribution as defined by Sections 1 through 9 of this document.
    
          "Licensor" shall mean the copyright owner or entity authorized by
          the copyright owner that is granting the License.
    
          "Legal Entity" shall mean the union of the acting entity and all
          other entities that control, are controlled by, or are under common
          control with that entity. For the purposes of this definition,
          "control" means (i) the power, direct or indirect, to cause the
          direction or management of such entity, whether by contract or
          otherwise, or (ii) ownership of fifty percent (50%) or more of the
          outstanding shares, or (iii) beneficial ownership of such entity.
    
          "You" (or "Your") shall mean an individual or Legal Entity
          exercising permissions granted by this License.
    
          "Source" form shall mean the preferred form for making modifications,
          including but not limited to software source code, documentation
          source, and configuration files.
    
          "Object" form shall mean any form resulting from mechanical
          transformation or translation of a Source form, including but
          not limited to compiled object code, generated documentation,
          and conversions to other media types.
    
          "Work" shall mean the work of authorship, whether in Source or
          Object form, made available under the License, as indicated by a
          copyright notice that is included in or attached to the work
          (an example is provided in the Appendix below).
    
          "Derivative Works" shall mean any work, whether in Source or Object
          form, that is based on (or derived from) the Work and for which the
          editorial revisions, annotations, elaborations, or other modifications
          represent, as a whole, an original work of authorship. For the purposes
          of this License, Derivative Works shall not include works that remain
          separable from, or merely link (or bind by name) to the interfaces of,
          the Work and Derivative Works thereof.
    
          "Contribution" shall mean any work of authorship, including
          the original version of the Work and any modifications or additions
          to that Work or Derivative Works thereof, that is intentionally
          submitted to Licensor for inclusion in the Work by the copyright owner
          or by an individual or Legal Entity authorized to submit on behalf of
          the copyright owner. For the purposes of this definition, "submitted"
          means any form of electronic, verbal, or written communication sent
          to the Licensor or its representatives, including but not limited to
          communication on electronic mailing lists, source code control systems,
          and issue tracking systems that are managed by, or on behalf of, the
          Licensor for the purpose of discussing and improving the Work, but
          excluding communication that is conspicuously marked or otherwise
          designated in writing by the copyright owner as "Not a Contribution."
    
          "Contributor" shall mean Licensor and any individual or Legal Entity
          on behalf of whom a Contribution has been received by Licensor and
          subsequently incorporated within the Work.
    
       2. Grant of Copyright License. Subject to the terms and conditions of
          this License, each Contributor hereby grants to You a perpetual,
          worldwide, non-exclusive, no-charge, royalty-free, irrevocable
          copyright license to reproduce, prepare Derivative Works of,
          publicly display, publicly perform, sublicense, and distribute the
          Work and such Derivative Works in Source or Object form.
    
       3. Grant of Patent License. Subject to the terms and conditions of
          this License, each Contributor hereby grants to You a perpetual,
          worldwide, non-exclusive, no-charge, royalty-free, irrevocable
          (except as stated in this section) patent license to make, have made,
          use, offer to sell, sell, import, and otherwise transfer the Work,
          where such license applies only to those patent claims licensable
          by such Contributor that are necessarily infringed by their
          Contribution(s) alone or by combination of their Contribution(s)
          with the Work to which such Contribution(s) was submitted. If You
          institute patent litigation against any entity (including a
          cross-claim or counterclaim in a lawsuit) alleging that the Work
          or a Contribution incorporated within the Work constitutes direct
          or contributory patent infringement, then any patent licenses
          granted to You under this License for that Work shall terminate
          as of the date such litigation is filed.
    
       4. Redistribution. You may reproduce and distribute copies of the
          Work or Derivative Works thereof in any medium, with or without
          modifications, and in Source or Object form, provided that You
          meet the following conditions:
    
          (a) You must give any other recipients of the Work or
              Derivative Works a copy of this License; and
    
          (b) You must cause any modified files to carry prominent notices
              stating that You changed the files; and
    
          (c) You must retain, in the Source form of any Derivative Works
              that You distribute, all copyright, patent, trademark, and
              attribution notices from the Source form of the Work,
              excluding those notices that do not pertain to any part of
              the Derivative Works; and
    
          (d) If the Work includes a "NOTICE" text file as part of its
              distribution, then any Derivative Works that You distribute must
              include a readable copy of the attribution notices contained
              within such NOTICE file, excluding those notices that do not
              pertain to any part of the Derivative Works, in at least one
              of the following places: within a NOTICE text file distributed
              as part of the Derivative Works; within the Source form or
              documentation, if provided along with the Derivative Works; or,
              within a display generated by the Derivative Works, if and
              wherever such third-party notices normally appear. The contents
              of the NOTICE file are for informational purposes only and
              do not modify the License. You may add Your own attribution
              notices within Derivative Works that You distribute, alongside
              or as an addendum to the NOTICE text from the Work, provided
              that such additional attribution notices cannot be construed
              as modifying the License.
    
          You may add Your own copyright statement to Your modifications and
          may provide additional or different license terms and conditions
          for use, reproduction, or distribution of Your modifications, or
          for any such Derivative Works as a whole, provided Your use,
          reproduction, and distribution of the Work otherwise complies with
          the conditions stated in this License.
    
       5. Submission of Contributions. Unless You explicitly state otherwise,
          any Contribution intentionally submitted for inclusion in the Work
          by You to the Licensor shall be under the terms and conditions of
          this License, without any additional terms or conditions.
          Notwithstanding the above, nothing herein shall supersede or modify
          the terms of any separate license agreement you may have executed
          with Licensor regarding such Contributions.
    
       6. Trademarks. This License does not grant permission to use the trade
          names, trademarks, service marks, or product names of the Licensor,
          except as required for reasonable and customary use in describing the
          origin of the Work and reproducing the content of the NOTICE file.
    
       7. Disclaimer of Warranty. Unless required by applicable law or
          agreed to in writing, Licensor provides the Work (and each
          Contributor provides its Contributions) on an "AS IS" BASIS,
          WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
          implied, including, without limitation, any warranties or conditions
          of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
          PARTICULAR PURPOSE. You are solely responsible for determining the
          appropriateness of using or redistributing the Work and assume any
          risks associated with Your exercise of permissions under this License.
    
       8. Limitation of Liability. In no event and under no legal theory,
          whether in tort (including negligence), contract, or otherwise,
          unless required by applicable law (such as deliberate and grossly
          negligent acts) or agreed to in writing, shall any Contributor be
          liable to You for damages, including any direct, indirect, special,
          incidental, or consequential damages of any character arising as a
          result of this License or out of the use or inability to use the
          Work (including but not limited to damages for loss of goodwill,
          work stoppage, computer failure or malfunction, or any and all
          other commercial damages or losses), even if such Contributor
          has been advised of the possibility of such damages.
    
       9. Accepting Warranty or Additional Liability. While redistributing
          the Work or Derivative Works thereof, You may choose to offer,
          and charge a fee for, acceptance of support, warranty, indemnity,
          or other liability obligations and/or rights consistent with this
          License. However, in accepting such obligations, You may act only
          on Your own behalf and on Your sole responsibility, not on behalf
          of any other Contributor, and only if You agree to indemnify,
          defend, and hold each Contributor harmless for any liability
          incurred by, or claims asserted against, such Contributor by reason
          of your accepting any such warranty or additional liability.
    
       END OF TERMS AND CONDITIONS
    
       APPENDIX: How to apply the Apache License to your work.
    
          To apply the Apache License to your work, attach the following
          boilerplate notice, with the fields enclosed by brackets "[]"
          replaced with your own identifying information. (Don't include
          the brackets!)  The text should be enclosed in the appropriate
          comment syntax for the file format. We also recommend that a
          file or class name and description of purpose be included on the
          same "printed page" as the copyright notice for easier
          identification within third-party archives.
    
       Copyright [2018] [Moritz Kanzler]
    
       Licensed under the Apache License, Version 2.0 (the "License");
       you may not use this file except in compliance with the License.
       You may obtain a copy of the License at
    
           http://www.apache.org/licenses/LICENSE-2.0
    
       Unless required by applicable law or agreed to in writing, software
       distributed under the License is distributed on an "AS IS" BASIS,
       WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
       See the License for the specific language governing permissions and
       limitations under the License.
    

    Visit original content creator repository
    https://github.com/Mo0812/homeswitch

  • DiskClone

    Raw disk clone tool

    A free and open-source raw disk clone tool written in Python. It creates a direct sector by sector block copy. It is able to skip bad sectors. No filesystem inspection is involved, so it is filesystem independent.

    Use this tool when you are running on a server OS and do not want to pay for commercial tools. The operation mechanism of this tool is very simple and straightforward.

    It is also helpful in cases where your alternative tool would stop working upon encountering bad sectors. Some commercial disk clone tools cannot handle bad sectors for some reason.

    Note: You cannot use this tool to clone the active OS disk, since it assumes the source disk is made read-only before cloning starts.

    INSPECT THE SOURCE CODE, UNDERSTAND WHAT IT DOES AND VERIFY THAT THE CODE IS CORRECT. THEN USE WITH CARE. I AM NOT RESPONSIBLE IN ANY WAY IF YOU LOSE YOUR DATA. ALL DATA ON DESTINATION DISK WILL BE OVERWRITTEN.
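
    The operation mechanism can be illustrated with a minimal, hypothetical Python sketch of the sector-by-sector copy with bad-sector skipping; the actual script additionally handles raw device access, progress reporting, and the psutil/pywin32 specifics.

    ```python
    SECTOR = 512  # assumed block size; the real tool may copy in larger chunks

    def clone(src_path, dst_path, sector=SECTOR):
        """Copy src to dst block by block; zero-fill blocks that fail to read."""
        bad = 0
        with open(src_path, "rb") as src, open(dst_path, "r+b") as dst:
            offset = 0
            while True:
                src.seek(offset)
                try:
                    block = src.read(sector)
                except OSError:
                    # Bad sector: write zeros so the destination stays aligned.
                    block = b"\x00" * sector
                    bad += 1
                if not block:
                    break  # reached the end of the source
                dst.seek(offset)
                dst.write(block)
                offset += sector
        return bad
    ```

    The bad-sector recovery mentioned in the roadmap would wrap the read call in a small retry loop before falling back to zero-fill.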

    Usage

    python diskclone.py SourceDisk DestinationDisk

    Under Windows:

    python diskclone.py "\\.\PhysicalDrive0" "\\.\PhysicalDrive1"

    or

    Under Linux:

    python diskclone.py "/dev/ploop12345" "/dev/ploop67890"

    A Python 2 or 3 installation is required. There are package dependencies:

    – psutil

    – pywin32 (under Windows OS only)

    Roadmap

    Bad-sector recovery functionality: sometimes bad sectors can be recovered by attempting to read them repeatedly. The necessary changes to the existing code are essentially just a few lines.

    Licence

    Version 1.0.1

    Copyright: Roland Pihlakas, 2023, roland@simplify.ee

    Licence: LGPL 2.1

    You can obtain a copy of this free software from https://github.com/levitation-opensource/DiskScan/

    State

    Ready to use. Maintained and in active use.

    Analytics

    Visit original content creator repository
    https://github.com/levitation-opensource/DiskClone