Several videos were flagged as inappropriate by an automatic system designed to identify extremist content.
Groups monitoring the conflict in Syria say such videos document the war and could be used in future war crime prosecutions.
YouTube said removing the videos, which was often a decision taken by human reviewers, had been "the wrong call".
"We have a situation where a few videos get wrongly flagged and a whole channel is deleted," said Eliot Higgins, founder of citizen journalism website Bellingcat.
"For those of us trying to document the conflict in Syria, this is a huge problem."
Inappropriate content
Mr Higgins told the BBC that YouTube's machine-learning system had started flagging videos that had been on the platform for several years.
"Some channels have tens of thousands of videos. Retroactively pointing a system at old videos is a bigger issue than YouTube realises," he said.
YouTube said it was "continuing to improve" the tools reviewers used to identify inappropriate content.
The company said that while it typically did not allow harmful content, it made exceptions for educational, documentary and scientific videos.
It said human reviewers considered the context of uploaded footage, including the video's title, tags and written description, as well as captions and descriptions within the video itself.
"When it's brought to our attention that a video or channel has been removed mistakenly, we act quickly to reinstate it," the company said in a statement.