
Benchmark #8

Merged
43 commits merged on Aug 18, 2022

Commits
3105a20
Merge pull request #2 from membraneframework-labs/ffmpeg-compositor
Janix4000 Jul 27, 2022
d15bd43
minor fix for CI
WojciechBarczynski Jul 28, 2022
7075bcc
improve spec
WojciechBarczynski Jul 28, 2022
64607df
moved pipeline_test
WojciechBarczynski Jul 28, 2022
0b09297
minor changes
WojciechBarczynski Jul 28, 2022
171fb4a
change handle caps
WojciechBarczynski Jul 28, 2022
0a31ee1
code reformat for CI
WojciechBarczynski Jul 28, 2022
50fd24e
add logging and h264 pipeline
WojciechBarczynski Jul 29, 2022
188d84e
add h264 pipeline test
WojciechBarczynski Jul 29, 2022
2d3eaaa
add tests
WojciechBarczynski Jul 29, 2022
7b7a0a8
polish tests
WojciechBarczynski Jul 29, 2022
7035841
Merge branch 'master' into merge-two-videos-template-pipeline
WojciechBarczynski Aug 4, 2022
57d8248
add ex_unit to dialyzer
WojciechBarczynski Aug 5, 2022
1cb4db2
Merge branch 'merge-two-videos-template-pipeline' of https://github.c…
WojciechBarczynski Aug 5, 2022
6a04e17
Merge pull request #1 from membraneframework-labs/merge-two-videos-te…
WojciechBarczynski Aug 5, 2022
5078835
Add merge frames Benchee benchmark
WojciechBarczynski Aug 5, 2022
a82d36d
Add merge frames Benchee benchmark
WojciechBarczynski Aug 5, 2022
2520597
Changed pipelines to send :finshed messages, changed pipelines, demos…
WojciechBarczynski Aug 5, 2022
f80a6b3
Changed pipelines to send :finshed messages, changed pipelines, demos…
WojciechBarczynski Aug 5, 2022
2fe813b
Add pipelines benchmarks, add HTML reports
WojciechBarczynski Aug 8, 2022
2746697
Add pipelines benchmarks, add HTML reports
WojciechBarczynski Aug 8, 2022
08bce9d
Add beamchmarks for raw and h264 pipelines, add benchee benchmarks fo…
WojciechBarczynski Aug 9, 2022
bdf3e69
Add beamchmarks for raw and h264 pipelines, add benchee benchmarks fo…
WojciechBarczynski Aug 9, 2022
e60eb39
Add beamchmarks for raw and h264 pipelines, add benchee benchmarks fo…
WojciechBarczynski Aug 9, 2022
e06fada
Minor changes benchmark
WojciechBarczynski Aug 9, 2022
a7f770b
Update .gitignore
WojciechBarczynski Aug 9, 2022
1dbf112
Update .gitignore
WojciechBarczynski Aug 9, 2022
2981097
fix merge conflicts
WojciechBarczynski Aug 9, 2022
933eb09
Seperated benchmark from main repo, add mix to benchmark folder, mino…
WojciechBarczynski Aug 10, 2022
ea943ad
Seperated benchmark from main repo, add mix to benchmark folder, mino…
WojciechBarczynski Aug 10, 2022
d061b51
Merge branch 'benchmark' of https://github.com/membraneframework-labs…
WojciechBarczynski Aug 10, 2022
c6012bb
fix demo_scripts
WojciechBarczynski Aug 10, 2022
aceb882
Clean up
WojciechBarczynski Aug 12, 2022
c79dc90
Resolve merge conflicts, add benchmark readme
WojciechBarczynski Aug 16, 2022
d369239
Fix benchmark readme images linking
WojciechBarczynski Aug 16, 2022
c804b25
Fix benchmark readme images linking
WojciechBarczynski Aug 16, 2022
51698eb
Fix benchmark readme
WojciechBarczynski Aug 16, 2022
6df9224
Fix benchmark readme
WojciechBarczynski Aug 16, 2022
5098112
Add benchmark.exs for running multiple benchmarks at once, improve re…
WojciechBarczynski Aug 17, 2022
ebff5ab
Add benchmark.exs exit codes handling
WojciechBarczynski Aug 18, 2022
df78345
Improve readme, fix typo in h264 pipeline benchee
WojciechBarczynski Aug 18, 2022
67e5731
Merge branch 'master' into benchmark
WojciechBarczynski Aug 18, 2022
f457974
Fix end bracket in .vsdoe/settings.json
WojciechBarczynski Aug 18, 2022
3 changes: 3 additions & 0 deletions .gitignore
@@ -3,6 +3,9 @@ compile_commands.json
bundlex.sh
bundlex.bat

# tests
test/fixtures/tmp_dir

# Dir generated by tmp_dir ExUnit tag
/tmp/
180 changes: 180 additions & 0 deletions benchmark/.credo.exs
@@ -0,0 +1,180 @@
# This file contains the configuration for Credo and you are probably reading
# this after creating it with `mix credo.gen.config`.
#
# If you find anything wrong or unclear in this file, please report an
# issue on GitHub: https://github.com/rrrene/credo/issues
#
%{
  #
  # You can have as many configs as you like in the `configs:` field.
  configs: [
    %{
      #
      # Run any config using `mix credo -C <name>`. If no config name is given
      # "default" is used.
      #
      name: "default",
      #
      # These are the files included in the analysis:
      files: %{
        #
        # You can give explicit globs or simply directories.
        # In the latter case `**/*.{ex,exs}` will be used.
        #
        included: [
          "lib/"
        ],
        excluded: [~r"/_build/", ~r"/deps/", ~r"/tmp_dir/"]
      },
      #
      # Load and configure plugins here:
      #
      plugins: [],
      #
      # If you create your own checks, you must specify the source files for
      # them here, so they can be loaded by Credo before running the analysis.
      #
      requires: [],
      #
      # If you want to enforce a style guide and need a more traditional linting
      # experience, you can change `strict` to `true` below:
      #
      strict: false,
      #
      # To modify the timeout for parsing files, change this value:
      #
      parse_timeout: 5000,
      #
      # If you want to use uncolored output by default, you can change `color`
      # to `false` below:
      #
      color: true,
      #
      # You can customize the parameters of any check by adding a second element
      # to the tuple.
      #
      # To disable a check put `false` as second element:
      #
      #     {Credo.Check.Design.DuplicatedCode, false}
      #
      checks: [
        #
        ## Consistency Checks
        #
        {Credo.Check.Consistency.ExceptionNames, []},
        {Credo.Check.Consistency.LineEndings, []},
        {Credo.Check.Consistency.ParameterPatternMatching, []},
        {Credo.Check.Consistency.SpaceAroundOperators, []},
        {Credo.Check.Consistency.SpaceInParentheses, []},
        {Credo.Check.Consistency.TabsOrSpaces, []},

        #
        ## Design Checks
        #
        # You can customize the priority of any check
        # Priority values are: `low, normal, high, higher`
        #
        {Credo.Check.Design.AliasUsage,
         [priority: :low, if_nested_deeper_than: 2, if_called_more_often_than: 0]},
        # You can also customize the exit_status of each check.
        # If you don't want TODO comments to cause `mix credo` to fail, just
        # set this value to 0 (zero).
        #
        {Credo.Check.Design.TagTODO, [exit_status: 0]},
        {Credo.Check.Design.TagFIXME, []},

        #
        ## Readability Checks
        #
        {Credo.Check.Readability.AliasOrder, [priority: :normal]},
        {Credo.Check.Readability.FunctionNames, []},
        {Credo.Check.Readability.LargeNumbers, []},
        {Credo.Check.Readability.MaxLineLength, [priority: :low, max_length: 120]},
        {Credo.Check.Readability.ModuleAttributeNames, []},
        {Credo.Check.Readability.ModuleDoc, []},
        {Credo.Check.Readability.ModuleNames, []},
        {Credo.Check.Readability.ParenthesesInCondition, []},
        {Credo.Check.Readability.ParenthesesOnZeroArityDefs, parens: true},
        {Credo.Check.Readability.PredicateFunctionNames, []},
        {Credo.Check.Readability.PreferImplicitTry, []},
        {Credo.Check.Readability.RedundantBlankLines, []},
        {Credo.Check.Readability.Semicolons, []},
        {Credo.Check.Readability.SpaceAfterCommas, []},
        {Credo.Check.Readability.StringSigils, []},
        {Credo.Check.Readability.TrailingBlankLine, []},
        {Credo.Check.Readability.TrailingWhiteSpace, []},
        {Credo.Check.Readability.UnnecessaryAliasExpansion, []},
        {Credo.Check.Readability.VariableNames, []},
        {Credo.Check.Readability.WithSingleClause, false},

        #
        ## Refactoring Opportunities
        #
        {Credo.Check.Refactor.CondStatements, []},
        {Credo.Check.Refactor.CyclomaticComplexity, []},
        {Credo.Check.Refactor.FunctionArity, []},
        {Credo.Check.Refactor.LongQuoteBlocks, []},
        {Credo.Check.Refactor.MapInto, false},
        {Credo.Check.Refactor.MatchInCondition, []},
        {Credo.Check.Refactor.NegatedConditionsInUnless, []},
        {Credo.Check.Refactor.NegatedConditionsWithElse, []},
        {Credo.Check.Refactor.Nesting, []},
        {Credo.Check.Refactor.UnlessWithElse, []},
        {Credo.Check.Refactor.WithClauses, []},

        #
        ## Warnings
        #
        {Credo.Check.Warning.BoolOperationOnSameValues, []},
        {Credo.Check.Warning.ExpensiveEmptyEnumCheck, []},
        {Credo.Check.Warning.IExPry, []},
        {Credo.Check.Warning.IoInspect, []},
        {Credo.Check.Warning.LazyLogging, false},
        {Credo.Check.Warning.MixEnv, []},
        {Credo.Check.Warning.OperationOnSameValues, []},
        {Credo.Check.Warning.OperationWithConstantResult, []},
        {Credo.Check.Warning.RaiseInsideRescue, []},
        {Credo.Check.Warning.UnusedEnumOperation, []},
        {Credo.Check.Warning.UnusedFileOperation, []},
        {Credo.Check.Warning.UnusedKeywordOperation, []},
        {Credo.Check.Warning.UnusedListOperation, []},
        {Credo.Check.Warning.UnusedPathOperation, []},
        {Credo.Check.Warning.UnusedRegexOperation, []},
        {Credo.Check.Warning.UnusedStringOperation, []},
        {Credo.Check.Warning.UnusedTupleOperation, []},
        {Credo.Check.Warning.UnsafeExec, []},

        #
        # Checks scheduled for next check update (opt-in for now, just replace `false` with `[]`)

        #
        # Controversial and experimental checks (opt-in, just replace `false` with `[]`)
        #
        {Credo.Check.Readability.StrictModuleLayout,
         priority: :normal, order: ~w/shortdoc moduledoc behaviour use import require alias/a},
        {Credo.Check.Consistency.MultiAliasImportRequireUse, false},
        {Credo.Check.Consistency.UnusedVariableNames, force: :meaningful},
        {Credo.Check.Design.DuplicatedCode, false},
        {Credo.Check.Readability.AliasAs, false},
        {Credo.Check.Readability.MultiAlias, false},
        {Credo.Check.Readability.Specs, []},
        {Credo.Check.Readability.SinglePipe, false},
        {Credo.Check.Readability.WithCustomTaggedTuple, false},
        {Credo.Check.Refactor.ABCSize, false},
        {Credo.Check.Refactor.AppendSingleItem, false},
        {Credo.Check.Refactor.DoubleBooleanNegation, false},
        {Credo.Check.Refactor.ModuleDependencies, false},
        {Credo.Check.Refactor.NegatedIsNil, false},
        {Credo.Check.Refactor.PipeChainStart, false},
        {Credo.Check.Refactor.VariableRebinding, false},
        {Credo.Check.Warning.LeakyEnvironment, false},
        {Credo.Check.Warning.MapGetUnsafePass, false},
        {Credo.Check.Warning.UnsafeToAtom, false}

        #
        # Custom checks can be created using `mix credo.gen.check`.
        #
      ]
    }
  ]
}
5 changes: 5 additions & 0 deletions benchmark/.gitignore
@@ -0,0 +1,5 @@
# benchmark
/_build
/deps
/tmp_dir
/results
60 changes: 60 additions & 0 deletions benchmark/README.md
@@ -0,0 +1,60 @@
# Benchmarks for Membrane Video Compositor Plugin

## Current benchmarks

1. Benchee (run time and memory usage benchmarks):
   - h264 pipeline benchmark: ```lib/benchee/h264_pipeline.exs```
   - raw pipeline benchmark: ```lib/benchee/raw_pipeline.exs```
   - merge frames benchmark: ```lib/benchee/merge_frames.exs```
2. Beamchmark (reductions, context switches, CPU and memory usage, schedulers):
   - h264 pipeline benchmark: ```lib/beamchmark/h264_pipeline.exs```
   - raw pipeline benchmark: ```lib/beamchmark/raw_pipeline.exs```

## How to run benchmarks:

1. Enter the benchmark folder: ```cd benchmark```
2. Fetch the dependencies: ```mix deps.get```
3. Run the benchmarks:
   1. To run a pack of benchmarks:
      - all benchmarks (estimated total run time: 27min 30s): ```mix run benchmark.exs```
      - Benchee benchmarks only (estimated total run time: 24min 30s): ```mix run benchmark.exs benchee```
      - Beamchmark benchmarks only (estimated total run time: 3min): ```mix run benchmark.exs beamchmark```
   2. To run a single benchmark:
      - Benchee benchmarks:
        - frame composition performance (estimated run time: 9min 40s): ```mix run lib/benchee/merge_frames.exs```
        - raw pipeline performance (estimated run time: 9min 40s): ```mix run lib/benchee/raw_pipeline.exs```
        - h264 pipeline performance (estimated run time: 5min 10s): ```mix run lib/benchee/h264_pipeline.exs```
      - Beamchmark benchmarks:
        - raw pipeline performance (estimated run time: 3min): ```mix run lib/beamchmark/raw_pipeline.exs```
        - h264 pipeline performance (estimated run time: 3min): ```mix run lib/beamchmark/h264_pipeline.exs```
4. Results are printed to the console and saved as HTML reports in the ```results``` directory.

## How to modify test length:

1. Benchee:
   - Modify the parameters passed to ```Benchee.run()``` (see the sketch after this list):
     - ```warmup``` - duration of the benchmark warmup
     - ```time``` - duration of the performance measurement
     - ```memory_time``` - duration of the memory usage measurement
2. Beamchmark:
   - Modify the ```benchmark_duration``` parameter in the ```benchmarks_options``` map.
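
For orientation, here is a minimal Benchee sketch showing where ```warmup```, ```time```, and ```memory_time``` live. The workload below is a placeholder, not this repo's frame composition code, and only the ```benchee``` dependency is assumed:

```elixir
# Minimal sketch: a stand-in workload plus the timing options that control
# how long the benchmark runs. Increase `time` for steadier measurements.
compose_frames = fn -> Enum.sum(1..1_000) end

Benchee.run(
  %{"merge frames" => compose_frames},
  warmup: 2,       # seconds spent warming up before measuring
  time: 60,        # seconds of runtime measurement
  memory_time: 2,  # seconds of memory usage measurement
  formatters: [Benchee.Formatters.Console]
)
```

For Beamchmark there is no library option to show here: ```benchmarks_options``` is this repository's own map rather than a library API.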

## Example benchmark results:

<h3 align="center"> Benchee pipelines results: </h3>

720p | 1080p
:-------------------------:|:-------------------------:
![Benchee h264 pipeline 720p 30s 30fps results](assets/results_benchee_h264_pipeline_720p_30s_30fps.png) | ![Benchee h264 pipeline 1080p 30s 30fps results](assets/results_benchee_h264_pipeline_1080p_30s_30fps.png)

<h3 align="center"> Benchee merge two frames results: </h3>

720p | 1080p
:-------------------------:|:-------------------------:
![Benchee merge two 720p frames results](assets/results_benchee_merge_frames_720p.png) | ![Benchee merge two 1080p frames results](assets/results_benchee_merge_frames_1080p.png)


<h3 align="center"> Beamchmark pipelines ffmpeg results: </h3>

h264 pipeline | raw pipeline
:-------------------------:|:-------------------------:
![Beamchmark h264 pipeline ffmpeg results](assets/results_beamchmark_h264_pipeline_ffmpeg.png) | ![Beamchmark raw pipeline ffmpeg results](assets/results_beamchmark_raw_pipeline_ffmpeg.png)
Six binary files (the benchmark result images under `benchmark/assets/` referenced by the README above) were added in this PR but are not rendered in the diff view.
79 changes: 79 additions & 0 deletions benchmark/benchmark.exs
@@ -0,0 +1,79 @@
defmodule Membrane.VideoCompositor.Benchmark.RunBenchmarks do
  @moduledoc """
  Implements functions for running the benchmarks.
  """

  def run_benchee_benchmarks() do
    {h264_pipeline_result, h264_pipeline_exit_code} =
      System.cmd("mix", ["run", "lib/benchee/h264_pipeline.exs"], stderr_to_stdout: true)

    IO.puts(h264_pipeline_result)

    {raw_pipeline_result, raw_pipeline_exit_code} =
      System.cmd("mix", ["run", "lib/benchee/raw_pipeline.exs"], stderr_to_stdout: true)

    IO.puts(raw_pipeline_result)

    {merge_frames_result, merge_frames_exit_code} =
      System.cmd("mix", ["run", "lib/benchee/merge_frames.exs"], stderr_to_stdout: true)

    IO.puts(merge_frames_result)

    case {h264_pipeline_exit_code, raw_pipeline_exit_code, merge_frames_exit_code} do
      {0, 0, 0} -> :ok
      _other -> :error
    end
  end

  def run_beamchmark_benchmarks() do
    {h264_pipeline_result, h264_pipeline_exit_code} =
      System.cmd("mix", ["run", "lib/beamchmark/h264_pipeline.exs"], stderr_to_stdout: true)

    IO.puts(h264_pipeline_result)

    {raw_pipeline_result, raw_pipeline_exit_code} =
      System.cmd("mix", ["run", "lib/beamchmark/raw_pipeline.exs"], stderr_to_stdout: true)

    IO.puts(raw_pipeline_result)

    case {h264_pipeline_exit_code, raw_pipeline_exit_code} do
      {0, 0} -> :ok
      _other -> :error
    end
  end
end

benchmark_type = System.argv()

alias Membrane.VideoCompositor.Benchmark.RunBenchmarks

case benchmark_type do
  ["benchee"] ->
    RunBenchmarks.run_benchee_benchmarks()

  ["beamchmark"] ->
    RunBenchmarks.run_beamchmark_benchmarks()

  _other ->
    benchee_exit_status = RunBenchmarks.run_benchee_benchmarks()
    beamchmark_exit_status = RunBenchmarks.run_beamchmark_benchmarks()

    case {benchee_exit_status, beamchmark_exit_status} do
      {:ok, :ok} -> :ok
      _other -> :error
    end
end
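
One commit in this PR mentions exit code handling, but as written the script only returns `:ok` or `:error` without setting the OS exit status. A minimal sketch of how that could be propagated, offered as an assumption rather than part of this PR:

```elixir
# Hypothetical follow-up: turn the :error result into a non-zero OS exit code
# so a CI job running `mix run benchmark.exs` fails visibly.
case RunBenchmarks.run_benchee_benchmarks() do
  :ok -> :ok
  # System.halt/1 stops the VM immediately with the given exit status.
  :error -> System.halt(1)
end
```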