Question: Is it possible to manually merge multiple resultset.json's and generate output? #219
Comments
Bumping this. We parallelize our suite across many boxes and I'm taking a crack at using the above script to merge results. Is there a built-in way to tackle this issue? |
Same as @gotascii, we have to merge coverage from parallel builds across VMs before sending it along. Any update on this feature request? |
@gogoLuby wanna make a PR? all your open source code are belong to us |
I'd love to, but I'm confused about the behavior of the merge:

test.1.json:

{
  "Unit Tests": {
    "coverage": {
      "lib/A.rb": [1, 1, null, null, 0, 0],
      "lib/B.rb": [1, null, 0]
    },
    "timestamp": 123
  }
}

test.2.json:

{
  "Unit Tests": {
    "coverage": {
      "lib/A.rb": [0, 0, null, null, 1, 1],
      "lib/C.rb": [null, 1, 0]
    },
    "timestamp": 456
  }
}

merge_coverage.rb:

require 'simplecov'

reports_dir = './'
coverage_file_pattern = 'test.*.json'
json_files = Dir[File.join(reports_dir, coverage_file_pattern)]

merged_hash = {}
json_files.each do |json_file|
  SimpleCov::JSON.parse(File.read(json_file)).each do |command_name, data|
    puts "#{json_file} -> '#{command_name}' -> #{data}"
    merged_hash = SimpleCov::Result.from_hash(command_name => data).original_result.merge_resultset(merged_hash)
  end
end

puts "merged_hash=#{merged_hash}"
merged_result = SimpleCov::Result.new(merged_hash)
puts "merged_result=#{merged_result.to_hash}"
|
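For anyone puzzled by the same thing: as far as I can tell from the old merge helpers, when two resultsets cover the same file the per-line hit counts are summed, and lines that are null (never relevant) in both stay null. A rough illustration in plain Ruby, not SimpleCov's actual implementation, using the lib/A.rb arrays from the two files above:

a = [1, 1, nil, nil, 0, 0]  # lib/A.rb from test.1.json
b = [0, 0, nil, nil, 1, 1]  # lib/A.rb from test.2.json
merged = a.zip(b).map { |x, y| x.nil? && y.nil? ? nil : x.to_i + y.to_i }
p merged # => [1, 1, nil, nil, 1, 1]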
I think the original issue can now be solved using:

SimpleCov.command_name "test:#{ENV['SOME_UNIQ_THING_FOR_A_BUILD']}"
# SOME_UNIQ_THING_FOR_A_BUILD has to be unique per build job, not per run of a job.

and share the |
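A minimal sketch of how that suggestion could be wired into a test helper; the environment variable name comes from the comment above, everything else (helper file, shared coverage directory) is an assumption:

# spec_helper.rb (hypothetical) -- each parallel job records under its own command name,
# so entries land side by side in the shared .resultset.json and SimpleCov merges them.
require 'simplecov'

SimpleCov.command_name "test:#{ENV['SOME_UNIQ_THING_FOR_A_BUILD']}"
SimpleCov.start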
+1 @slowjack2k What if it is not possible to share |
@grzesiek It is always possible to share a file. The only question is how much time you have to spend to get it working. I don't know Docker in depth, but as far as I know you can use a data container to share data between multiple Docker machines. You can also use GlusterFS or rsync. Within your build scripts you have to ensure that only one test run accesses these files at a time. |
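One way to satisfy the "only one test run accesses the file at a time" requirement is an advisory file lock around any read/merge/write of the shared resultset; a minimal sketch (the lock path is my own choice, not something SimpleCov requires):

require 'simplecov'

lock_path = File.join(SimpleCov.coverage_path, '.resultset.json.lock')
File.open(lock_path, File::RDWR | File::CREAT) do |lock|
  lock.flock(File::LOCK_EX) # blocks until any other job releases the lock
  # ... read, merge, and rewrite the shared .resultset.json here ...
end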
@slowjack2k Those solutions won't work for me. I think a better solution would be to add support for this feature in |
👍 would make life a lot easier. |
After some tests, it looks like it may be quite easy to have something like this in SimpleCov. I'm currently testing the following patch:

module SimpleCov
  module ResultMerger
    class << self
      def resultset_files
        Dir.glob(File.join(SimpleCov.coverage_path, '*', '.resultset.json'))
      end

      def resultset_hashes
        resultset_files.map do |path|
          begin
            JSON.parse(File.read(path))
          rescue
            {}
          end
        end
      end

      def resultset
        resultset_hashes.reduce({}, :merge)
      end
    end
  end
end
|
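Assuming a SimpleCov version from that era, where ResultMerger builds its Result objects from resultset, the patch could be exercised roughly like this (the directory layout and patch file name are illustrative):

# Copy each node's coverage into its own subdirectory first, e.g.
#   coverage/node-1/.resultset.json
#   coverage/node-2/.resultset.json
require 'simplecov'
require_relative 'simplecov_resultset_patch' # hypothetical file holding the patch above

merged = SimpleCov::ResultMerger.merged_result # one Result built from all globbed files
merged.format!                                 # render with the configured formatter(s)
puts merged.covered_percent.round(2)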
👍 for having built-in support for merging. The scenario I have: we are using CircleCI to run our tests, which can run them in parallel. What they provide is a means to "log into" another node and fetch data from it. The current plan for building out a unified coverage report is to |
@grzesiek your changes seem the simplest; how has that been working for you? |
@urkle We are using the patch mentioned above to calculate code coverage for GitLab Community Edition, because GitLab CI also supports parallelization. We have an issue about contributing this back to Simplecov but it is not scheduled yet. |
@bf4 thanks! I'll test this out tomorrow and verify it works as expected. |
@urkle please come back and say if it worked so that we might close this out :) |
@PragTob from my testing it seems that it does the trick. |
Most of the thanks goes to @aroben! I'll close this one now; if you disagree, let me know and we can reopen. |
@bf4 I am not sure how #558 solves this issue. What I'm doing is still something like this:

require 'json'
require 'simplecov'

results = []
ARGV.each do |arg|
  path = File.join(arg, ".resultset.json")
  json = JSON.parse(File.read(path))
  json.each do |command_name, data|
    results << SimpleCov::Result.from_hash(command_name => data)
  end
end
merged_result = SimpleCov::ResultMerger.merge_results(*results)
puts merged_result.covered_percent

(I run this with

Is this the best way to solve it now? I still feel like I'm depending on SimpleCov's internals that might change in the future. |
@jgonera I've expanded on your script and refined it just a tad:

require 'json'
require 'simplecov'

# initialize data members and configure SimpleCov
coverage_results = []
SimpleCov.filters.clear

ARGV.each do |arg|
  # load JSON results from the coverage folder
  file = File.join(arg, ".resultset.json")
  file_results = JSON.parse(File.read(file))

  # parse results from the coverage file into the array
  file_results.each do |command, data|
    result = SimpleCov::Result.from_hash(command => data)
    coverage_results << result
  end
end

# merge results from the array into a single result object
merged_results = SimpleCov::ResultMerger.merge_results(*coverage_results)

# save the merged results to a file
File.open("./results.json", "w") do |f|
  f.write(JSON.pretty_generate(merged_results.to_hash))
end

Mostly for readability: I saved the results to a file and cleared the filters so project paths are ignored (helps if the tests come from different instances, and the output can easily be parsed when aggregated). Hope it helps others. |
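If the saved results.json later needs to become a browsable report, something along these lines should work against the same older Result.from_hash API used in the script above (the file name comes from that script; the rest is an assumption):

require 'json'
require 'simplecov'

# results.json holds a single merged entry: { "<joined command names>" => { "coverage" => ..., "timestamp" => ... } }
merged_hash = JSON.parse(File.read('./results.json'))
result = SimpleCov::Result.from_hash(merged_hash)

# Write an HTML report into SimpleCov.coverage_dir
SimpleCov::Formatter::HTMLFormatter.new.format(result)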
FWIW, I have an implementation which separates the coverage gathering from the report building/build failing, which I've been meaning to share back as a PR for too long, and which has everything you need to merge results. (see the .simplecov and scripts/coverage_report.rb files below)

.simplecov:

ENV['FULL_BUILD'] ||= ENV['CI']
generate_report = !!(ENV['COVERAGE'] =~ /\Atrue\z/i)
running_ci = !!(ENV['FULL_BUILD'] =~ /\Atrue\z/i)
generate_result = running_ci || generate_report

require_relative 'scripts/coverage_report'
reporter = CoverageReport.new

if generate_report
  reporter.configure_to_generate_report!
  SimpleCov.at_exit do
    reporter.generate_report!
  end
end

if generate_result
  # only start when generating a result
  SimpleCov.start 'app'
  STDERR.puts '[COVERAGE] Running'
  reporter.configure_to_generate_result!
end

SimpleCov.formatters = reporter.formatters

scripts/coverage_report.rb:

#!/usr/bin/env ruby
require 'json'

class CoverageReport
  attr_reader :formatters

  def initialize
    @formatters = []
  end

  def configure_to_generate_result!
    SimpleCov.configure do
      # use_merging true
      minimum_coverage 0.0 # disable
      maximum_coverage_drop 100.0 # disable
    end
    SimpleCov.at_exit do
      STDERR.puts "[COVERAGE] creating #{File.join(SimpleCov.coverage_dir, '.resultset.json')}"
      SimpleCov.result.format!
    end
  end

  def configure_to_generate_report!
    @minimum_coverage = ENV.fetch('COVERAGE_MINIMUM') { 100.0 }.to_f.round(2)
    SimpleCov.configure do
      minimum_coverage @minimum_coverage
      # minimum_coverage_by_file 60
      # maximum_coverage_drop 1
      refuse_coverage_drop
    end
    @formatters = [SimpleCov::Formatter::HTMLFormatter]
  end

  def generate_report!
    report_dir = SimpleCov.coverage_dir
    file = File.join(report_dir, '.resultset.json')
    if File.exist?(file)
      json = JSON.parse(File.read(file))
      result = SimpleCov::Result.from_hash(json)
      results = [result]
      merged_result = SimpleCov::ResultMerger.merge_results(*results)
      merged_result.format!
      STDERR.puts "[COVERAGE] merged #{file}; processing..."
      process_result(merged_result)
    else
      abort "No files found to report: #{Dir.glob(report_dir)}"
    end
  end

  # https://github.com/colszowka/simplecov/blob/v0.14.1/lib/simplecov/defaults.rb#L71-L98
  def process_result(result)
    @exit_status = SimpleCov::ExitCodes::SUCCESS
    covered_percent = result.covered_percent.round(2)
    covered_percentages = result.covered_percentages.map { |p| p.round(2) }
    if @exit_status == SimpleCov::ExitCodes::SUCCESS # No other errors
      if covered_percent < SimpleCov.minimum_coverage # rubocop:disable Metrics/BlockNesting
        $stderr.printf("Coverage (%.2f%%) is below the expected minimum coverage (%.2f%%).\n", covered_percent, SimpleCov.minimum_coverage)
        @exit_status = SimpleCov::ExitCodes::MINIMUM_COVERAGE
      elsif covered_percentages.any? { |p| p < SimpleCov.minimum_coverage_by_file } # rubocop:disable Metrics/BlockNesting
        $stderr.printf("File (%s) is only (%.2f%%) covered. This is below the expected minimum coverage per file of (%.2f%%).\n", result.least_covered_file, covered_percentages.min, SimpleCov.minimum_coverage_by_file)
        @exit_status = SimpleCov::ExitCodes::MINIMUM_COVERAGE
      elsif (last_run = SimpleCov::LastRun.read) # rubocop:disable Metrics/BlockNesting
        coverage_diff = last_run["result"]["covered_percent"] - covered_percent
        if coverage_diff > SimpleCov.maximum_coverage_drop # rubocop:disable Metrics/BlockNesting
          $stderr.printf("Coverage has dropped by %.2f%% since the last time (maximum allowed: %.2f%%).\n", coverage_diff, SimpleCov.maximum_coverage_drop)
          @exit_status = SimpleCov::ExitCodes::MAXIMUM_COVERAGE_DROP
        end
      end
    end
    # Don't overwrite last_run file if refuse_coverage_drop option is enabled and the coverage has dropped
    unless @exit_status == SimpleCov::ExitCodes::MAXIMUM_COVERAGE_DROP
      SimpleCov::LastRun.write(:result => {:covered_percent => covered_percent})
    end
    # Force exit with stored status (see github issue #5)
    # unless it's nil or 0 (see github issue #281)
    Kernel.exit @exit_status if @exit_status && @exit_status > 0
  end
end

if __FILE__ == $0
  require 'simplecov'
  reporter = CoverageReport.new
  reporter.configure_to_generate_report!
  reporter.generate_report!
end
|
Thanks @bf4, very helpful to see it all fit together. FYI, here's the short version I use to just merge different resultset JSON files into one report.
|
EDIT: Got it working by running this code without the coverage var set. This code was adding its own coverage report because of my .simplecov config. |
@nroose @justinpincar you might want to take a look at #780 |
Hey guys, I am using Ruby 2.2. Tried the above snippets; none works for me. |
@prtk418 if you just wanted to use "mainline" collate, forking the gem and adjusting the required Ruby version might help. I'm not aware of anything specifically broken in 2.2. Beyond that, you could try applying what we did to your Ruby version: #780. That said, the 2.2 --> 2.4 upgrade is relatively easy/straightforward AFAIK, so you might just want to upgrade Ruby :) |
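For anyone on a SimpleCov new enough to ship the collate API referenced above, the manual merge scripts in this thread collapse into something like this sketch (the directory name and profile are illustrative):

# collate.rb -- run after copying each node's .resultset.json into coverage_parts/<node>/
require 'simplecov'

SimpleCov.collate Dir['coverage_parts/**/.resultset.json'], 'rails' do
  formatter SimpleCov::Formatter::HTMLFormatter
end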
Hi,
I'm setting up a CI build at the moment and pushing .resultset.json result files to the next build in the line. Is it possible to manually merge these files? Could you point me in the direction of the right commands to hook into for doing this?
At the moment I'm rebuilding the logic to do it manually with a preliminary rake task.
But it feels very clunky, even though it works. Have there been plans to optionally allow multiple resultset files?
Cheers,
Tom
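For readers who want the rake-task route Tom describes, a sketch of what such a task might look like (the task name, paths, and the older Result.from_hash API mirror the snippets in the comments above and are assumptions):

# lib/tasks/coverage.rake (hypothetical)
require 'json'
require 'simplecov'

namespace :coverage do
  desc 'Merge .resultset.json files pushed from earlier builds'
  task :merge do
    # Build one Result per recorded command name across all pushed resultsets
    results = Dir['tmp/resultsets/*/.resultset.json'].flat_map do |path|
      JSON.parse(File.read(path)).map do |command_name, data|
        SimpleCov::Result.from_hash(command_name => data)
      end
    end
    merged = SimpleCov::ResultMerger.merge_results(*results)
    puts "Merged coverage: #{merged.covered_percent.round(2)}%"
  end
end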