Installing goleft
goleft is distributed through the bioconda channel, so it is installed with conda itself (bioconda is a channel name, not a command, as the first attempt shows):
➜ falconUnzip_assembly bioconda install goleft
zsh: command not found: bioconda
➜ falconUnzip_assembly conda install goleft
Solving environment: done
## Package Plan ##
environment location: /home/urbe/anaconda3
added / updated specs:
- goleft
The following packages will be downloaded:
package | build
---------------------------|-----------------
ca-certificates-2018.8.24 | ha4d7672_0 136 KB conda-forge
certifi-2018.8.24 | py36_1 139 KB conda-forge
goleft-0.1.18 | 1 6.6 MB bioconda
openssl-1.0.2p | h470a237_0 3.5 MB conda-forge
------------------------------------------------------------
Total: 10.4 MB
The following NEW packages will be INSTALLED:
goleft: 0.1.18-1 bioconda
The following packages will be UPDATED:
ca-certificates: 2018.8.13-ha4d7672_0 conda-forge --> 2018.8.24-ha4d7672_0 conda-forge
certifi: 2018.8.13-py36_0 conda-forge --> 2018.8.24-py36_1 conda-forge
openssl: 1.0.2o-h470a237_1 conda-forge --> 1.0.2p-h470a237_0 conda-forge
Proceed ([y]/n)? Y
Downloading and Extracting Packages
ca-certificates-2018 | 136 KB | ####################################################################################################################### | 100%
certifi-2018.8.24 | 139 KB | ####################################################################################################################### | 100%
goleft-0.1.18 | 6.6 MB | ####################################################################################################################### | 100%
openssl-1.0.2p | 3.5 MB | ####################################################################################################################### | 100%
Preparing transaction: done
Verifying transaction: done
Executing transaction: done
Benchmarking a Perl module means measuring how fast its code runs, typically in terms of CPU and wall-clock time. The core Benchmark module provides a simple, standardized way to time Perl code and compare alternatives.
Here is a quick guide to benchmarking a Perl module using the Benchmark module:
Check that the Benchmark module is available: Benchmark ships with core Perl, so it is normally already installed. If your Perl somehow lacks it, it can be installed from CPAN:
cpan Benchmark
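A quick way to confirm that Benchmark is available is a one-liner that loads the module and prints its version (this only assumes a working perl on your PATH):
perl -MBenchmark -e 'print "Benchmark $Benchmark::VERSION\n"'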
Write benchmark code: Write a small program that exercises the functionality of your module, wrapped in a subroutine that runs the code you want to time. Here's an example of a benchmark subroutine that calls a function from a hypothetical module named "MyModule":
use MyModule;
use Benchmark qw(:hireswallclock);

sub benchmark_my_module {
    my $result = MyModule::my_function();
}
Run the benchmark: Call the timethese function from the Benchmark module. It takes two arguments: the number of iterations to run and a hash reference mapping a label to the code reference you want to benchmark. Here's how to run the benchmark for 100 iterations:
timethese(100, { 'MyModule' => \&benchmark_my_module });
Analyze the results: For each label, timethese prints the elapsed wall-clock time, the user and system CPU time, and the iteration rate. You can use these figures to compare your module against other modules or against earlier versions of your own code. The Benchmark module also exports cmpthese (on request), which prints a table comparing the rates of several subroutines, as sketched below.
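Here is a minimal sketch of a cmpthese comparison. MyModule::my_function comes from the example above; MyModule::my_other_function is a hypothetical second implementation added only for illustration:
use Benchmark qw(cmpthese);
use MyModule;

# A negative count means "run each variant for at least that many CPU seconds".
# cmpthese then prints a table of rates and relative speed differences.
cmpthese(-2, {
    'my_function'       => sub { MyModule::my_function() },
    'my_other_function' => sub { MyModule::my_other_function() },
});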
In summary: write a benchmark subroutine that exercises the module, run it with timethese, and read the timings it reports to see how the module performs relative to other code or to previous versions of itself. A complete, self-contained sketch of these steps follows.
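Putting the pieces together, here is a minimal end-to-end sketch of the workflow. It assumes the hypothetical MyModule used throughout this guide; swap in your own module and function:
use strict;
use warnings;
use Benchmark qw(:hireswallclock timethese cmpthese);
use MyModule;    # hypothetical module from the examples above

# Subroutine that exercises the code under test.
sub benchmark_my_module {
    my $result = MyModule::my_function();
    return $result;
}

# Time 100 iterations; timethese prints wall-clock and CPU times per label
# and returns the raw results keyed by label.
my $results = timethese(100, {
    'MyModule' => \&benchmark_my_module,
});

# Optionally turn the same results into a comparison table of rates.
cmpthese($results);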