Error in Train-segformer-segmentation-on-custom-data.ipynb

from torch.utils.data import DataLoader
from transformers import SegformerImageProcessor

# Image processor for the pretrained SegFormer checkpoint
feature_extractor = SegformerImageProcessor.from_pretrained("nvidia/segformer-b0-finetuned-ade-512-512")
feature_extractor.do_reduce_labels = False
feature_extractor.size = 128

# Datasets for each Roboflow split (SemanticSegmentationDataset is defined in an earlier cell)
train_dataset = SemanticSegmentationDataset(f"{dataset.location}/train/", feature_extractor)
val_dataset = SemanticSegmentationDataset(f"{dataset.location}/valid/", feature_extractor)
test_dataset = SemanticSegmentationDataset(f"{dataset.location}/test/", feature_extractor)

batch_size = 8
num_workers = 2
train_dataloader = DataLoader(train_dataset, batch_size=batch_size, shuffle=True, num_workers=num_workers)
val_dataloader = DataLoader(val_dataset, batch_size=batch_size, num_workers=num_workers)
test_dataloader = DataLoader(test_dataset, batch_size=batch_size, num_workers=num_workers)

segformer_finetuner = SegformerFinetuner(
    train_dataset.id2label,
    train_dataloader=train_dataloader,
    val_dataloader=val_dataloader,
    test_dataloader=test_dataloader,
    metrics_interval=10,
)

The above is the code I ran.

This is the output and traceback I get:

Downloading (…)rocessor_config.json: 100% 271/271 [00:00<00:00, 7.46kB/s]
The reduce_labels parameter is deprecated and will be removed in a future version. Please use do_reduce_labels instead.

FileNotFoundError                         Traceback (most recent call last)
in <cell line: 7>()
      5 feature_extractor.size = 128
      6
----> 7 train_dataset = SemanticSegmentationDataset(f"{dataset.location}/train/", feature_extractor)
      8 val_dataset = SemanticSegmentationDataset(f"{dataset.location}/valid/", feature_extractor)
      9 test_dataset = SemanticSegmentationDataset(f"{dataset.location}/test/", feature_extractor)

in __init__(self, root_dir, feature_extractor)
     13
     14         self.classes_csv_file = os.path.join(self.root_dir, "_classes.csv")
---> 15         with open(self.classes_csv_file, 'r') as fid:
     16             data = [l.split(',') for i,l in enumerate(fid) if i != 0]
     17         self.id2label = {x[0]:x[1] for x in data}

FileNotFoundError: [Errno 2] No such file or directory: '/content/Dog\u3000poop-1/train/_classes.csv'

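I notice the path in the error contains a full-width space (\u3000) between "Dog" and "poop", so maybe the folder name or its contents are not what the notebook expects. This is a small diagnostic cell I could run (just a sketch, assuming the dataset.location variable from the Roboflow download step) to see what is actually on disk:

import os

# Hypothetical diagnostic cell (not part of the notebook): print the raw download
# location and list each split folder to check whether train/_classes.csv exists
# and whether the folder name really contains the full-width space (\u3000).
print(repr(dataset.location))
for split in ("train", "valid", "test"):
    split_dir = os.path.join(dataset.location, split)
    if os.path.isdir(split_dir):
        print(split, sorted(os.listdir(split_dir))[:10])  # first few entries
    else:
        print(split, "-> directory not found")
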
This is the code provided by Roboflow, and I don't know how to handle this error.
Any help would be greatly appreciated.