Complete Guide to Video Conversion Tools | High-Quality Video Processing for MP4, WebM, and AVI

A thorough, roughly 4,500-character guide to professional-grade video conversion: format conversion, codec selection, resolution changes, frame-rate adjustment, subtitle handling, and streaming optimization.

18 min read

Complete Guide to Video Conversion Tools

Introduction: Why Video Conversion Matters, and the State of the Art

With the explosive growth of video content, efficient video conversion has become more important than ever. Delivering high-quality content such as 4K/8K video, HDR, and high-frame-rate footage optimally across devices and platforms requires the right conversion strategy. This article covers the field comprehensively, from the latest codec technology to practical conversion techniques.

Chapter 1: Understanding Video Formats and Codecs

1.1 Major Container Formats

Characteristics and use cases of each format

const videoFormats = {
  MP4: {
    extension: '.mp4',
    mimeType: 'video/mp4',
    codecs: {
      video: ['H.264', 'H.265/HEVC', 'AV1'],
      audio: ['AAC', 'MP3', 'AC3']
    },
    features: {
      streaming: true,
      chapters: true,
      subtitles: true,
      metadata: true
    },
    compatibility: 'universal',
    useCase: 'general-purpose delivery, streaming'
  },

  WebM: {
    extension: '.webm',
    mimeType: 'video/webm',
    codecs: {
      video: ['VP8', 'VP9', 'AV1'],
      audio: ['Vorbis', 'Opus']
    },
    features: {
      openSource: true,
      streaming: true,
      alpha: true  // supports transparent (alpha-channel) video
    },
    compatibility: 'web browsers',
    useCase: 'web delivery, transparent video'
  },

  MKV: {
    extension: '.mkv',
    mimeType: 'video/x-matroska',
    codecs: {
      video: ['any'],  // accepts virtually any codec
      audio: ['any']
    },
    features: {
      multipleStreams: true,
      chapters: true,
      attachments: true,
      subtitles: 'unlimited'
    },
    compatibility: 'limited',
    useCase: 'archiving, high-quality preservation'
  },

  MOV: {
    extension: '.mov',
    mimeType: 'video/quicktime',
    codecs: {
      video: ['ProRes', 'H.264', 'H.265'],
      audio: ['AAC', 'PCM']
    },
    features: {
      professional: true,
      metadata: 'extensive',
      colorSpace: 'wide'
    },
    compatibility: 'Apple ecosystem',
    useCase: 'professional editing, Apple devices'
  }
};
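As a minimal illustration of how a table like this can drive automated container selection, the sketch below (the `pickFormat` helper and its feature keys are illustrative, not from any library) returns the first container whose features cover the stated requirements:

```javascript
// Minimal sketch: pick the first container whose feature set covers the
// requirements. The table mirrors a subset of the videoFormats object above.
const containerFeatures = {
  MP4:  { streaming: true,  alpha: false, anyCodec: false },
  WebM: { streaming: true,  alpha: true,  anyCodec: false },
  MKV:  { streaming: false, alpha: false, anyCodec: true }
};

function pickFormat(required) {
  // A container qualifies when every feature flagged true is supported
  return Object.keys(containerFeatures).find((name) =>
    Object.entries(required).every(([k, v]) => !v || containerFeatures[name][k])
  ) || null;
}
```

For example, requiring an alpha channel steers the choice to WebM, while requiring arbitrary codecs steers it to MKV.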

1.2 Detailed Video Codec Comparison

Characteristics of modern codec technologies

class VideoCodecAnalyzer {
  getCodecProfile(codec, quality) {
    const profiles = {
      'H.264': {
        baseline: {
          profile: 'baseline',
          level: '3.0',
          bitrate: this.calculateH264Bitrate(quality),
          compatibility: 'maximum',
          features: []
        },
        main: {
          profile: 'main',
          level: '4.0',
          bitrate: this.calculateH264Bitrate(quality) * 0.85,
          compatibility: 'high',
          features: ['B-frames', 'CABAC']
        },
        high: {
          profile: 'high',
          level: '5.1',
          bitrate: this.calculateH264Bitrate(quality) * 0.7,
          compatibility: 'modern devices',
          features: ['8x8 transform', 'quantization matrices']
        }
      },

      'H.265': {
        main: {
          profile: 'main',
          level: '4.0',
          bitrate: this.calculateH264Bitrate(quality) * 0.5,
          compatibility: 'newer devices',
          features: ['CTU', 'SAO', 'parallel processing']
        },
        main10: {
          profile: 'main10',
          level: '5.0',
          bitrate: this.calculateH264Bitrate(quality) * 0.5,
          bitDepth: 10,
          compatibility: 'HDR capable',
          features: ['10-bit color', 'HDR']
        }
      },

      'VP9': {
        profile0: {
          profile: 0,
          bitDepth: 8,
          chromaSubsampling: '4:2:0',
          bitrate: this.calculateH264Bitrate(quality) * 0.6,
          compatibility: 'web browsers'
        },
        profile2: {
          profile: 2,
          bitDepth: 10,
          chromaSubsampling: '4:2:0',
          bitrate: this.calculateH264Bitrate(quality) * 0.6,
          compatibility: 'HDR streaming'
        }
      },

      'AV1': {
        main: {
          profile: 'main',
          level: '4.0',
          bitrate: this.calculateH264Bitrate(quality) * 0.4,
          compatibility: 'latest browsers',
          features: ['film grain synthesis', 'screen content coding']
        },
        professional: {
          profile: 'professional',
          bitDepth: 12,
          chromaSubsampling: '4:4:4',
          bitrate: this.calculateH264Bitrate(quality) * 0.45,
          compatibility: 'professional tools'
        }
      }
    };

    return profiles[codec] || profiles['H.264'];
  }

  calculateH264Bitrate(quality, fps = 30) {
    // Baseline H.264 bitrates in Mbps, keyed by resolution label.
    // `quality` is the resolution label passed down from getCodecProfile(),
    // e.g. '1080p'.
    const baseBitrates = {
      '480p': { 30: 2.5, 60: 4 },
      '720p': { 30: 5, 60: 7.5 },
      '1080p': { 30: 8, 60: 12 },
      '1440p': { 30: 16, 60: 24 },
      '4K': { 30: 35, 60: 50 },
      '8K': { 30: 80, 60: 120 }
    };

    const key = fps > 30 ? 60 : 30;
    return baseBitrates[quality]?.[key] || 8;
  }
}
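The scaling pattern in `getCodecProfile` can be distilled into a standalone calculator: each newer codec targets the H.264 baseline multiplied by a compression factor. The factors below mirror the multipliers used above (0.5 for H.265, 0.6 for VP9, 0.4 for AV1); they are rough rules of thumb, not measured benchmarks:

```javascript
// Rule-of-thumb bitrate targets (Mbps) relative to an H.264 baseline.
const h264Baseline = {
  '720p':  { 30: 5,  60: 7.5 },
  '1080p': { 30: 8,  60: 12 },
  '4K':    { 30: 35, 60: 50 }
};

// Approximate compression efficiency relative to H.264 (smaller = better)
const codecFactor = { 'H.264': 1.0, 'H.265': 0.5, 'VP9': 0.6, 'AV1': 0.4 };

function targetBitrate(codec, resolution, fps = 30) {
  const key = fps > 30 ? 60 : 30;
  const base = h264Baseline[resolution]?.[key] ?? 8;
  return base * (codecFactor[codec] ?? 1.0);
}
```

So a 4K/60 H.265 encode would target roughly half the bitrate of its H.264 counterpart.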

Chapter 2: Implementing High-Quality Video Conversion

2.1 Advanced Conversion with FFmpeg

A professional-quality conversion implementation

const ffmpeg = require('fluent-ffmpeg');
const path = require('path');

class VideoConverter {
  constructor(options = {}) {
    this.hardwareAcceleration = options.hwaccel || 'auto';
    this.threads = options.threads || 0;  // 0 = auto
  }

  async convertVideo(input, output, options = {}) {
    const {
      codec = 'libx264',
      preset = 'medium',
      crf = 23,  // Constant Rate Factor (0-51, lower = better)
      resolution = null,
      fps = null,
      bitrate = null,
      twoPass = false,
      hdr = false
    } = options;

    return new Promise((resolve, reject) => {
      let command = ffmpeg(input);

      // Hardware acceleration
      if (this.hardwareAcceleration !== 'none') {
        command = this.applyHardwareAcceleration(command, codec);
      }

      // Video codec
      command = command.videoCodec(this.getCodecName(codec));

      // Quality: a fixed bitrate and CRF are alternatives
      if (bitrate) {
        command = command.videoBitrate(bitrate);
      } else {
        command = command.outputOptions([`-crf ${crf}`]);
      }

      // Encoder preset (speed/compression trade-off)
      command = command.outputOptions([`-preset ${preset}`]);

      // Resolution change
      if (resolution) {
        command = command.size(resolution);
      }

      // Frame rate
      if (fps) {
        command = command.fps(fps);
      }

      // HDR handling
      if (hdr) {
        command = this.applyHDRSettings(command);
      }

      // Two-pass encoding takes a separate path
      if (twoPass) {
        this.twoPassEncode(input, output, { ...options, codec })
          .then(resolve, reject);
        return;
      }

      // Run
      command
        .output(output)
        .on('start', (cmd) => console.log('FFmpeg command:', cmd))
        .on('progress', (progress) => {
          if (options.onProgress) {
            options.onProgress(progress);
          }
        })
        .on('end', () => resolve({ success: true, output }))
        .on('error', reject)
        .run();
    });
  }

  applyHardwareAcceleration(command, codec) {
    const hwAccelMap = {
      'nvidia': {
        decoder: 'h264_cuvid',
        encoder: 'h264_nvenc',
        options: ['-hwaccel', 'cuda']
      },
      'intel': {
        decoder: 'h264_qsv',
        encoder: 'h264_qsv',
        options: ['-hwaccel', 'qsv']
      },
      'amd': {
        decoder: 'h264_amf',
        encoder: 'h264_amf',
        options: ['-hwaccel', 'd3d11va']
      },
      'apple': {
        decoder: 'h264_videotoolbox',
        encoder: 'h264_videotoolbox',
        options: ['-hwaccel', 'videotoolbox']
      }
    };

    // detectHardware() is assumed to return 'nvidia' | 'intel' | 'amd' | 'apple'
    // (see the GPU detection code in Chapter 4) or null when nothing is found
    const hw = this.detectHardware();
    if (hw && hwAccelMap[hw]) {
      const accel = hwAccelMap[hw];
      return command
        .inputOptions(accel.options)
        .videoCodec(accel.encoder);
    }

    return command;
  }

  async twoPassEncode(input, output, options) {
    const passlogfile = path.join(
      path.dirname(output),
      'ffmpeg2pass'
    );
    const codec = this.getCodecName(options.codec || 'libx264');

    // Pass 1: analysis only (audio disabled, video output discarded)
    await new Promise((resolve, reject) => {
      ffmpeg(input)
        .videoCodec(codec)
        .outputOptions([
          '-pass 1',
          `-passlogfile ${passlogfile}`,
          '-an',
          '-f null'
        ])
        .output(process.platform === 'win32' ? 'NUL' : '/dev/null')
        .on('end', resolve)
        .on('error', reject)
        .run();
    });

    // Pass 2: the actual encode, reusing the statistics from pass 1
    return new Promise((resolve, reject) => {
      ffmpeg(input)
        .videoCodec(codec)
        .outputOptions([
          '-pass 2',
          `-passlogfile ${passlogfile}`
        ])
        .output(output)
        .on('end', () => {
          // Remove the pass log files (cleanup() helper assumed)
          this.cleanup(passlogfile);
          resolve({ success: true, output });
        })
        .on('error', reject)
        .run();
    });
  }

  applyHDRSettings(command) {
    return command.outputOptions([
      '-pix_fmt yuv420p10le',
      '-color_primaries bt2020',
      '-color_trc smpte2084',
      '-colorspace bt2020nc',
      '-color_range tv',
      '-max_muxing_queue_size 1024'
    ]);
  }

  getCodecName(codec) {
    const codecMap = {
      'H.264': 'libx264',
      'H.265': 'libx265',
      'VP9': 'libvpx-vp9',
      'AV1': 'libaom-av1',
      'ProRes': 'prores_ks'
    };

    return codecMap[codec] || codec;
  }
}
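To see what such a configuration amounts to on the command line, here is a hedged sketch that maps the same options onto raw FFmpeg CLI arguments (the `buildArgs` helper is illustrative only; fluent-ffmpeg assembles an equivalent command internally):

```javascript
// Sketch: translate convertVideo()-style options into raw FFmpeg CLI arguments,
// useful for inspecting what will actually run.
function buildArgs(input, output, { codec = 'libx264', preset = 'medium',
                                    crf = 23, bitrate = null, fps = null,
                                    resolution = null } = {}) {
  const args = ['-i', input, '-c:v', codec];
  // CRF and a fixed bitrate are alternatives: CRF targets constant quality,
  // a bitrate targets a predictable file size
  args.push(...(bitrate ? ['-b:v', bitrate] : ['-crf', String(crf)]));
  args.push('-preset', preset);
  if (resolution) args.push('-s', resolution);
  if (fps) args.push('-r', String(fps));
  args.push(output);
  return args;
}
```

Prefixing the result with `ffmpeg` gives a command you can paste into a shell for debugging.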

2.2 Conversion for Adaptive Streaming

An implementation supporting HLS and DASH

const fs = require('fs').promises;

class AdaptiveStreamingConverter {
  async generateHLS(input, outputDir, options = {}) {
    const {
      variants = [
        { resolution: '426x240', bitrate: '400k', name: '240p' },
        { resolution: '640x360', bitrate: '800k', name: '360p' },
        { resolution: '854x480', bitrate: '1400k', name: '480p' },
        { resolution: '1280x720', bitrate: '2800k', name: '720p' },
        { resolution: '1920x1080', bitrate: '5000k', name: '1080p' }
      ],
      segmentDuration = 6,
      codec = 'H.264'
    } = options;

    const masterPlaylist = [];

    for (const variant of variants) {
      const variantDir = path.join(outputDir, variant.name);
      await fs.mkdir(variantDir, { recursive: true });

      // Generate the stream for this resolution
      await this.generateVariant(
        input,
        variantDir,
        variant,
        segmentDuration,
        codec
      );

      // Add this variant to the master playlist
      const bandwidth = parseInt(variant.bitrate) * 1000;
      masterPlaylist.push(
        `#EXT-X-STREAM-INF:BANDWIDTH=${bandwidth},RESOLUTION=${variant.resolution}`,
        `${variant.name}/playlist.m3u8`
      );
    }

    // Write the master playlist
    const masterContent = [
      '#EXTM3U',
      '#EXT-X-VERSION:3',
      ...masterPlaylist
    ].join('\n');

    await fs.writeFile(
      path.join(outputDir, 'master.m3u8'),
      masterContent
    );

    return { success: true, masterPlaylist: 'master.m3u8' };
  }

  async generateVariant(input, outputDir, variant, segmentDuration, codec) {
    return new Promise((resolve, reject) => {
      ffmpeg(input)
        .videoCodec(this.getCodecForStreaming(codec))
        .size(variant.resolution)
        .videoBitrate(variant.bitrate)
        .audioCodec('aac')
        .audioBitrate('128k')
        .outputOptions([
          '-hls_time', segmentDuration,
          '-hls_list_size', 0,
          '-hls_segment_filename', path.join(outputDir, 'segment_%03d.ts'),
          '-start_number', 0,
          // HLS flags: independent_segments lets players begin decoding at
          // any segment (delete_segments is a live-window flag, not for VOD)
          '-hls_flags', 'independent_segments',
          '-g', segmentDuration * 30,  // GOP size
          '-sc_threshold', 0,
          '-force_key_frames', `expr:gte(t,n_forced*${segmentDuration})`
        ])
        .output(path.join(outputDir, 'playlist.m3u8'))
        .on('end', resolve)
        .on('error', reject)
        .run();
    });
  }

  async generateDASH(input, outputDir, options = {}) {
    const {
      representations = [
        { width: 640, height: 360, bitrate: '500k' },
        { width: 854, height: 480, bitrate: '1000k' },
        { width: 1280, height: 720, bitrate: '2500k' },
        { width: 1920, height: 1080, bitrate: '4500k' }
      ],
      segmentDuration = 4
    } = options;

    const outputs = representations.map((rep, index) => ({
      path: path.join(outputDir, `video_${index}.mp4`),
      ...rep
    }));

    // 各表現を生成
    for (const output of outputs) {
      await this.generateRepresentation(input, output);
    }

    // MPD生成
    await this.generateMPD(outputDir, outputs, segmentDuration);

    return { success: true, manifest: 'manifest.mpd' };
  }

  async generateRepresentation(input, output) {
    return new Promise((resolve, reject) => {
      ffmpeg(input)
        .size(`${output.width}x${output.height}`)
        .videoBitrate(output.bitrate)
        .videoCodec('libx264')
        .audioCodec('aac')
        .outputOptions([
          '-movflags', '+faststart',
          // Fixed GOP so segment boundaries align across representations
          '-keyint_min', '48',
          '-g', '48',
          '-sc_threshold', '0',
          '-profile:v', 'high',
          '-level', '4.0'
          // DASH muxer options such as -use_timeline, -use_template and
          // -adaptation_sets apply to the manifest step (generateMPD),
          // not to encoding the individual MP4 representations
        ])
        .output(output.path)
        .on('end', resolve)
        .on('error', reject)
        .run();
    });
  }

  getCodecForStreaming(codec) {
    // fluent-ffmpeg's videoCodec() expects a bare encoder name; profile and
    // speed tuning (e.g. '-profile:v baseline -level 3.0' for maximum H.264
    // compatibility) belongs in outputOptions()
    const streamingCodecs = {
      'H.264': 'libx264',
      'H.265': 'libx265',
      'VP9': 'libvpx-vp9',
      'AV1': 'libaom-av1'
    };

    return streamingCodecs[codec] || streamingCodecs['H.264'];
  }
}
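The master playlist that `generateHLS` writes is plain text and easy to verify in isolation. A pure sketch of its construction:

```javascript
// Sketch: build the HLS master playlist text for a set of variants,
// mirroring what generateHLS() writes above.
function buildMasterPlaylist(variants) {
  const lines = ['#EXTM3U', '#EXT-X-VERSION:3'];
  for (const v of variants) {
    // BANDWIDTH is expressed in bits per second, so '2800k' becomes 2800000
    const bandwidth = parseInt(v.bitrate, 10) * 1000;
    lines.push(`#EXT-X-STREAM-INF:BANDWIDTH=${bandwidth},RESOLUTION=${v.resolution}`);
    lines.push(`${v.name}/playlist.m3u8`);
  }
  return lines.join('\n');
}
```

Each variant contributes a `#EXT-X-STREAM-INF` line followed by the relative path to its media playlist.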

2.3 Subtitle and Metadata Handling

Embedding and converting subtitles

class SubtitleProcessor {
  async embedSubtitles(videoPath, subtitlePath, outputPath, options = {}) {
    const {
      hardSub = false,  // hard subtitles (burned into the video)
      language = 'jpn',
      fontName = 'Arial',
      fontSize = 24,
      fontColor = 'white',
      backgroundColor = 'black@0.5',
      position = 'bottom'
    } = options;

    return new Promise((resolve, reject) => {
      let command = ffmpeg(videoPath);

      if (hardSub) {
        // Hard subtitles (burned into the picture)
        const subtitleFilter = `subtitles=${subtitlePath}:force_style='` +
          `Fontname=${fontName},Fontsize=${fontSize},` +
          `PrimaryColour=${this.colorToASS(fontColor)},` +
          `BackColour=${this.colorToASS(backgroundColor)},` +
          `Alignment=${this.getAlignment(position)}'`;

        command = command.videoFilters(subtitleFilter);
      } else {
        // Soft subtitles (embedded as a separate track)
        command = command
          .input(subtitlePath)
          .outputOptions([
            '-c:v copy',
            '-c:a copy',
            '-c:s mov_text',
            `-metadata:s:s:0 language=${language}`,
            '-disposition:s:0 default'
          ]);
      }

      command
        .output(outputPath)
        .on('end', () => resolve({ success: true }))
        .on('error', reject)
        .run();
    });
  }

  async extractSubtitles(videoPath, outputPath, options = {}) {
    const { streamIndex = 0 } = options;

    return new Promise((resolve, reject) => {
      ffmpeg(videoPath)
        .outputOptions([
          `-map 0:s:${streamIndex}`,
          '-c:s srt'  // the subrip encoder produces .srt output
        ])
        .output(outputPath)
        .on('end', () => resolve({ success: true }))
        .on('error', reject)
        .run();
    });
  }

  async convertSubtitleFormat(inputPath, outputPath, fromFormat, toFormat) {
    const converters = {
      'srt_to_vtt': (content) => {
        return 'WEBVTT\n\n' + content
          .replace(/\r\n/g, '\n')
          .replace(/(\d+)\n(\d{2}:\d{2}:\d{2}),(\d{3})/g, '$1\n$2.$3');
      },
      'vtt_to_srt': (content) => {
        return content
          .replace(/WEBVTT\n\n/, '')
          .replace(/(\d{2}:\d{2}:\d{2})\.(\d{3})/g, '$1,$2');
      },
      'ass_to_srt': (content) => {
        // ASS/SSA to SRT conversion
        const lines = content.split('\n');
        const events = lines.filter(line => line.startsWith('Dialogue:'));
        let srtContent = '';
        let counter = 1;

        for (const event of events) {
          const parts = event.split(',');
          const start = this.assTimeToSrt(parts[1]);
          const end = this.assTimeToSrt(parts[2]);
          const text = parts.slice(9).join(',').replace(/{[^}]+}/g, '');

          srtContent += `${counter}\n${start} --> ${end}\n${text}\n\n`;
          counter++;
        }

        return srtContent;
      }
    };

    const converterKey = `${fromFormat}_to_${toFormat}`;
    const converter = converters[converterKey];

    if (!converter) {
      throw new Error(`Conversion from ${fromFormat} to ${toFormat} not supported`);
    }

    const inputContent = await fs.readFile(inputPath, 'utf8');
    const outputContent = converter(inputContent);
    await fs.writeFile(outputPath, outputContent);

    return { success: true };
  }

  colorToASS(color) {
    // Convert #RRGGBB to the ASS &HAABBGGRR form (note the reversed byte
    // order); components must be zero-padded to two hex digits
    if (color.startsWith('#')) {
      const hex = color.slice(1);
      const toHex = (s) => parseInt(s, 16).toString(16).padStart(2, '0');
      const r = toHex(hex.slice(0, 2));
      const g = toHex(hex.slice(2, 4));
      const b = toHex(hex.slice(4, 6));
      return `&H00${b}${g}${r}`.toUpperCase();
    }
    return '&H00FFFFFF';  // default: white
  }

  getAlignment(position) {
    const alignmentMap = {
      'bottom': 2,
      'middle': 5,
      'top': 8
    };
    return alignmentMap[position] || 2;
  }
}
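The `ass_to_srt` converter above calls an `assTimeToSrt()` helper that is not shown. A minimal sketch, under the assumption that ASS timestamps use `H:MM:SS.cc` (centiseconds) while SRT expects `HH:MM:SS,mmm` (milliseconds):

```javascript
// Sketch of the assTimeToSrt() helper used by ass_to_srt above.
// ASS: "0:00:01.50" (centiseconds) -> SRT: "00:00:01,500" (milliseconds)
function assTimeToSrt(assTime) {
  const [h, m, rest] = assTime.trim().split(':');
  const [s, cs] = rest.split('.');
  // Centiseconds to milliseconds, zero-padded to three digits
  const ms = String(Number(cs) * 10).padStart(3, '0');
  return `${h.padStart(2, '0')}:${m}:${s},${ms}`;
}
```

The hour field is the only one ASS does not zero-pad, so it is the only one that needs padding.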

Chapter 3: AI-Driven Video Processing

3.1 Improving Video Quality

AI-based upscaling

const os = require('os');
const fs = require('fs').promises;

class AIVideoEnhancer {
  async upscaleVideo(inputPath, outputPath, options = {}) {
    const {
      scale = 2,  // 2x, 4x
      model = 'esrgan',  // esrgan, waifu2x, real-esrgan
      denoise = true,
      deblock = true,
      interpolation = 'lanczos'
    } = options;

    // Extract the frames
    const framesDir = await this.extractFrames(inputPath);

    // Enhance each frame with the AI model
    const enhancedFrames = await this.processFrames(framesDir, {
      scale,
      model,
      denoise,
      deblock
    });

    // Reassemble the frames into a video
    await this.reconstructVideo(enhancedFrames, outputPath, {
      fps: await this.getOriginalFPS(inputPath),
      codec: options.codec || 'libx264'
    });

    // Clean up temporary directories
    await this.cleanup([framesDir, enhancedFrames]);

    return { success: true, output: outputPath };
  }

  async extractFrames(videoPath) {
    const outputDir = path.join(os.tmpdir(), `frames_${Date.now()}`);
    await fs.mkdir(outputDir, { recursive: true });

    return new Promise((resolve, reject) => {
      ffmpeg(videoPath)
        // No fps filter needed: every frame is extracted at the source rate
        .output(path.join(outputDir, 'frame_%06d.png'))
        .on('end', () => resolve(outputDir))
        .on('error', reject)
        .run();
    });
  }

  async processFrames(framesDir, options) {
    const frames = await fs.readdir(framesDir);
    const outputDir = path.join(os.tmpdir(), `enhanced_${Date.now()}`);
    await fs.mkdir(outputDir, { recursive: true });

    // Batch the frames for efficiency
    const batchSize = 10;
    for (let i = 0; i < frames.length; i += batchSize) {
      const batch = frames.slice(i, i + batchSize);
      await Promise.all(batch.map(async (frame) => {
        const inputPath = path.join(framesDir, frame);
        const outputPath = path.join(outputDir, frame);
        await this.enhanceFrame(inputPath, outputPath, options);
      }));
    }

    return outputDir;
  }

  async enhanceFrame(inputPath, outputPath, options) {
    // Example using an ESRGAN model (in practice via TensorFlow.js or ONNX Runtime)
    const tf = require('@tensorflow/tfjs-node');
    const model = await this.loadModel(options.model);

    // Load and preprocess the image
    const inputTensor = await this.preprocessImage(inputPath);

    // Run inference
    const outputTensor = await model.predict(inputTensor);

    // Postprocess and save
    await this.saveEnhancedImage(outputTensor, outputPath);

    // Free tensor memory
    inputTensor.dispose();
    outputTensor.dispose();
  }

  // Frame interpolation (motion interpolation)
  async interpolateFrames(videoPath, outputPath, targetFPS) {
    const originalFPS = await this.getOriginalFPS(videoPath);
    const interpolationFactor = targetFPS / originalFPS;

    return new Promise((resolve, reject) => {
      ffmpeg(videoPath)
        .videoFilters([
          `minterpolate=fps=${targetFPS}:mi_mode=mci:mc_mode=aobmc:vsbmc=1`
        ])
        .output(outputPath)
        .on('end', () => resolve({ success: true }))
        .on('error', reject)
        .run();
    });
  }
}
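The filter string passed to `interpolateFrames` can be factored out and sanity-checked on its own; `mi_mode=mci` selects motion-compensated interpolation and `mc_mode=aobmc` adaptive overlapped block motion compensation (the standalone helper is a sketch for illustration):

```javascript
// Sketch: build the minterpolate filter string used by interpolateFrames().
// mci = motion-compensated interpolation, aobmc = adaptive overlapped block
// motion compensation, vsbmc = variable-size block motion compensation.
function minterpolateFilter(targetFPS) {
  if (!Number.isFinite(targetFPS) || targetFPS <= 0) {
    throw new Error(`invalid target fps: ${targetFPS}`);
  }
  return `minterpolate=fps=${targetFPS}:mi_mode=mci:mc_mode=aobmc:vsbmc=1`;
}
```

Validating the target frame rate up front avoids handing FFmpeg a malformed filter graph.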

3.2 Automatic Scene Detection and Editing

Scene analysis with AI

class SceneAnalyzer {
  async detectScenes(videoPath, options = {}) {
    const {
      threshold = 0.3,  // scene-change threshold (0-1)
      minSceneDuration = 1.0  // minimum scene length in seconds
    } = options;

    const scenes = [];

    return new Promise((resolve, reject) => {
      ffmpeg(videoPath)
        .outputOptions([
          '-filter_complex',
          `select='gt(scene,${threshold})',metadata=print:file=-`,
          '-f', 'null'
        ])
        .output('-')
        .on('stderr', (stderrLine) => {
          // Detect scene-change timestamps
          const match = stderrLine.match(/pts_time:(\d+\.\d+)/);
          if (match) {
            const time = parseFloat(match[1]);
            if (scenes.length === 0 ||
                time - scenes[scenes.length - 1] > minSceneDuration) {
              scenes.push(time);
            }
          }
        })
        .on('end', () => resolve(scenes))
        .on('error', reject)
        .run();
    });
  }

  async generateHighlights(videoPath, outputPath, options = {}) {
    const {
      duration = 60,  // highlight length in seconds
      detectFaces = true,
      detectAction = true,
      detectAudio = true
    } = options;

    // Detect scene boundaries
    const scenes = await this.detectScenes(videoPath);

    // Score each scene
    const scoredScenes = await this.scoreScenes(videoPath, scenes, {
      detectFaces,
      detectAction,
      detectAudio
    });

    // Select the top-scoring scenes
    const selectedScenes = this.selectTopScenes(scoredScenes, duration);

    // Build the highlight video
    await this.concatenateScenes(videoPath, selectedScenes, outputPath);

    return { success: true, output: outputPath };
  }

  async scoreScenes(videoPath, scenes, criteria) {
    const scores = [];

    for (let i = 0; i < scenes.length - 1; i++) {
      const start = scenes[i];
      const end = scenes[i + 1];
      const duration = end - start;

      let score = 0;

      // Amount of motion
      if (criteria.detectAction) {
        const motion = await this.analyzeMotion(videoPath, start, end);
        score += motion * 0.3;
      }

      // Audio level
      if (criteria.detectAudio) {
        const audioLevel = await this.analyzeAudio(videoPath, start, end);
        score += audioLevel * 0.3;
      }

      // Face detection
      if (criteria.detectFaces) {
        const faceCount = await this.detectFacesInScene(videoPath, start, end);
        score += Math.min(faceCount / 3, 1) * 0.4;
      }

      scores.push({
        start,
        end,
        duration,
        score
      });
    }

    return scores;
  }
}
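`generateHighlights` relies on a `selectTopScenes()` helper that is not shown. One plausible sketch (an assumption, not the article's definitive method): greedily take the highest-scoring scenes until the duration budget is exhausted, then restore chronological order so the reel plays in sequence:

```javascript
// Sketch of the selectTopScenes() helper used by generateHighlights():
// greedy selection by score under a total-duration budget, then re-sorted
// chronologically so the highlight reel plays in order.
function selectTopScenes(scoredScenes, maxDuration) {
  const selected = [];
  let total = 0;
  for (const scene of [...scoredScenes].sort((a, b) => b.score - a.score)) {
    if (total + scene.duration <= maxDuration) {
      selected.push(scene);
      total += scene.duration;
    }
  }
  return selected.sort((a, b) => a.start - b.start);
}
```

A greedy pick is not guaranteed optimal (this is a knapsack-style problem), but it is simple and usually good enough for highlight reels.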

Chapter 4: Performance Optimization

4.1 Parallel Processing and GPU Utilization

Implementing multi-process parallelism

const cluster = require('cluster');
const os = require('os');

class ParallelVideoProcessor {
  constructor() {
    this.workerCount = os.cpus().length;
  }

  async processVideosBatch(videos, processingFunc) {
    // cluster.isPrimary superseded cluster.isMaster in Node.js 16
    if (cluster.isPrimary ?? cluster.isMaster) {
      return this.masterProcess(videos, processingFunc);
    } else {
      return this.workerProcess(processingFunc);
    }
  }

  async masterProcess(videos, processingFunc) {
    // chunkArray() takes a chunk *size*, so derive it from the worker count
    // to hand each worker exactly one chunk
    const chunkSize = Math.ceil(videos.length / this.workerCount);
    const chunks = this.chunkArray(videos, chunkSize);
    const results = [];

    // Spawn one worker per chunk
    for (let i = 0; i < chunks.length; i++) {
      const worker = cluster.fork();

      worker.on('message', (msg) => {
        if (msg.type === 'result') {
          results.push(msg.data);
        }
      });

      // Assign this worker its chunk of videos
      worker.send({
        type: 'process',
        data: chunks[i]
      });
    }

    return new Promise((resolve) => {
      cluster.on('exit', (worker, code, signal) => {
        if (Object.keys(cluster.workers).length === 0) {
          resolve(results);
        }
      });
    });
  }

  async workerProcess(processingFunc) {
    process.on('message', async (msg) => {
      if (msg.type === 'process') {
        const results = [];

        for (const video of msg.data) {
          try {
            const result = await processingFunc(video);
            results.push(result);
          } catch (error) {
            results.push({ error: error.message });
          }
        }

        process.send({
          type: 'result',
          data: results
        });

        process.exit(0);
      }
    });
  }

  // GPU detection and utilization
  async detectGPU() {
    // checkIntelGPU() and checkAMDGPU() follow the same pattern as
    // checkNvidia() below: probe a vendor CLI tool and catch failures
    const gpuInfo = {
      nvidia: await this.checkNvidia(),
      intel: await this.checkIntelGPU(),
      amd: await this.checkAMDGPU()
    };

    return Object.entries(gpuInfo)
      .filter(([_, available]) => available)
      .map(([vendor]) => vendor);
  }

  async checkNvidia() {
    try {
      const { execSync } = require('child_process');
      execSync('nvidia-smi');
      return true;
    } catch {
      return false;
    }
  }

  getGPUEncoder(vendor, codec) {
    const encoders = {
      nvidia: {
        'H.264': 'h264_nvenc',
        'H.265': 'hevc_nvenc',
        'AV1': 'av1_nvenc'
      },
      intel: {
        'H.264': 'h264_qsv',
        'H.265': 'hevc_qsv',
        'AV1': 'av1_qsv'
      },
      amd: {
        'H.264': 'h264_amf',
        'H.265': 'hevc_amf'
      }
    };

    return encoders[vendor]?.[codec] || null;
  }

  chunkArray(array, size) {
    const chunks = [];
    for (let i = 0; i < array.length; i += size) {
      chunks.push(array.slice(i, i + size));
    }
    return chunks;
  }
}
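The work-splitting step deserves a closer look as a pure helper. Since `chunkArray()` takes a chunk *size*, getting exactly one chunk per worker means deriving the size from the worker count first (the `chunkForWorkers` name is illustrative):

```javascript
// Sketch: split work so each of N workers gets at most one chunk.
// chunkArray-style slicing, but with the size derived from the worker count.
function chunkForWorkers(items, workerCount) {
  const size = Math.ceil(items.length / workerCount);
  const chunks = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}
```

Passing the worker count directly as the chunk size would instead produce chunks *of* that size, leaving some workers idle and some work unassigned.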

Chapter 5: Troubleshooting and Quality Assurance

5.1 Common Problems and Solutions

Diagnosing video conversion problems

class VideoTroubleshooter {
  async diagnoseVideo(videoPath) {
    const issues = [];

    // Fetch metadata
    const metadata = await this.getVideoMetadata(videoPath);

    // Codec compatibility check
    if (this.isUnsupportedCodec(metadata.codec)) {
      issues.push({
        type: 'codec',
        severity: 'high',
        message: `Unsupported codec: ${metadata.codec}`,
        solution: 'Re-encode with H.264 or H.265'
      });
    }

    // Corruption check
    const corruption = await this.checkCorruption(videoPath);
    if (corruption.detected) {
      issues.push({
        type: 'corruption',
        severity: 'critical',
        message: corruption.details,
        solution: 'Try to repair with ffmpeg -err_detect ignore_err'
      });
    }

    // Audio/video sync check
    const syncIssue = await this.checkAudioVideoSync(videoPath);
    if (Math.abs(syncIssue.offset) > 100) {  // more than 100 ms of drift
      issues.push({
        type: 'sync',
        severity: 'medium',
        message: `A/V sync offset: ${syncIssue.offset}ms`,
        solution: `Use -itsoffset ${-syncIssue.offset/1000} option`
      });
    }

    // Abnormally high bitrate
    if (metadata.bitrate > 100000) {  // above 100 Mbps, assuming the value is in kbps
      issues.push({
        type: 'bitrate',
        severity: 'low',
        message: 'Extremely high bitrate detected',
        solution: 'Consider re-encoding with lower bitrate'
      });
    }

    return issues;
  }

  async repairVideo(videoPath, outputPath, issues) {
    let ffmpegOptions = [];

    for (const issue of issues) {
      switch (issue.type) {
        case 'corruption':
          ffmpegOptions.push('-err_detect', 'ignore_err');
          ffmpegOptions.push('-fflags', '+genpts+igndts');
          break;

        case 'sync': {
          // Pull the offset in seconds back out of the suggested option string
          const offset = parseFloat(issue.solution.match(/-?\d+(\.\d+)?/)[0]);
          ffmpegOptions.push('-itsoffset', String(offset));
          break;
        }

        case 'codec':
          ffmpegOptions.push('-c:v', 'libx264');
          ffmpegOptions.push('-preset', 'medium');
          ffmpegOptions.push('-crf', '23');
          break;
      }
    }

    return new Promise((resolve, reject) => {
      const command = ffmpeg(videoPath);

      if (ffmpegOptions.length > 0) {
        command.outputOptions(ffmpegOptions);
      }

      command
        .output(outputPath)
        .on('end', () => resolve({ success: true }))
        .on('error', reject)
        .run();
    });
  }

  async checkAudioVideoSync(videoPath) {
    // Use FFmpeg's showinfo/ashowinfo filters to dump frame timestamps
    return new Promise((resolve, reject) => {
      let syncData = '';

      ffmpeg(videoPath)
        .outputOptions([
          '-af', 'ashowinfo',
          '-vf', 'showinfo',
          '-f', 'null'
        ])
        .output('-')
        .on('stderr', (stderrLine) => {
          syncData += stderrLine;
        })
        .on('end', () => {
          // Parse the timestamps and compute the A/V offset
          const offset = this.calculateSyncOffset(syncData);
          resolve({ offset });
        })
        .on('error', reject)
        .run();
    });
  }
}
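`checkAudioVideoSync` defers to a `calculateSyncOffset()` helper that is not defined above. A hedged sketch, assuming the collected log contains `showinfo`/`ashowinfo` lines with `pts_time:` fields (a positive result means the audio starts later than the video):

```javascript
// Sketch of the calculateSyncOffset() helper assumed above: take the first
// video and audio pts_time values from showinfo/ashowinfo log lines and
// return the difference in milliseconds.
function calculateSyncOffset(log) {
  const first = (re) => {
    const m = log.match(re);
    return m ? parseFloat(m[1]) : null;
  };
  const video = first(/Parsed_showinfo.*?pts_time:(\d+(?:\.\d+)?)/);
  const audio = first(/Parsed_ashowinfo.*?pts_time:(\d+(?:\.\d+)?)/);
  if (video === null || audio === null) return 0;
  return Math.round((audio - video) * 1000);
}
```

Comparing only the first timestamps catches a constant offset; drifting sync would require comparing timestamps across the whole file.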

Security and Privacy

All processing runs entirely in your browser, and no data is sent to external servers, so even personal or confidential material can be handled with confidence.

Troubleshooting

Common issues

  • If the tool does not work: clear your browser cache and reload the page
  • If processing is slow: check the file size (20 MB or less is recommended)
  • If the output looks wrong: double-check the input format and settings

If the problem persists, update your browser to the latest version or try a different browser.

Conclusion: Putting Next-Generation Video Conversion to Work

Video conversion technology has entered a new phase, driven by codec evolution and the integration of AI. Keeping the following points in mind will help you process video effectively:

  1. Choose the right codec: optimize for the use case and target devices
  2. Balance quality and size: set CRF and bitrate appropriately
  3. Support streaming: adaptive delivery via HLS/DASH
  4. Leverage AI: automate quality enhancement and content analysis
  5. Optimize performance: use GPUs and parallel processing
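Point 2 can be made concrete: for libx264, a change of about ±6 CRF roughly halves or doubles the output size. A small sketch (the starting CRF values below are common community guidance, not official figures):

```javascript
// Rough starting CRF values for libx264 (lower = higher quality, bigger file).
// These are common rules of thumb, not official recommendations.
const startingCrf = { archive: 18, delivery: 23, preview: 28 };

// For x264, size roughly halves for every +6 CRF, so relative size can be
// estimated as 2^((reference - crf) / 6).
function relativeSize(crf, referenceCrf = 23) {
  return 2 ** ((referenceCrf - crf) / 6);
}
```

So an archive encode at CRF 18 can be expected to come out roughly 1.8x the size of a CRF 23 delivery encode of the same source.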

With the i4u video conversion tool, you can perform high-quality conversions with ease.
