Bacula File Daemon segfaults and crashes when the BPIPE plugin is used
Summary
| Reproducibility | Platform | OS | OS Version | Product Version |
|---|---|---|---|---|
| always | AMD64 | CentOS | 8 | 15.0.2 |
Description
Hello Everybody,
I noticed that every attempt to use the BPIPE plugin with Bacula 15.0
causes the File Daemon to segfault and crash.
The crash happens in bpipe-fd.c:freePlugin(), which contains the line:
bfree_and_null(p_ctx->cmd);
For bfree_and_null() to work, p_ctx->cmd must have been allocated with the
smart memory allocator. Otherwise, when SMARTALLOC is enabled, freeing it
causes the segfault.
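To illustrate why this crashes (a simplified, hypothetical model only, not Bacula's actual SMARTALLOC implementation): a tracking allocator keeps a bookkeeping header just in front of every pointer it hands out, and its free routine steps back to that header. If it is given a string that came from a plain strdup()/malloc(), there is no header in front of it, so it reads and frees invalid memory:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical bookkeeping header stored in front of each allocation. */
struct sm_header {
   size_t size;
   const char *file;
   int line;
};

/* Allocate n bytes and hide a header just before the returned pointer. */
static void *sm_malloc(size_t n, const char *file, int line)
{
   struct sm_header *h = malloc(sizeof(*h) + n);
   if (!h) {
      return NULL;
   }
   h->size = n;
   h->file = file;
   h->line = line;
   return h + 1;               /* caller only sees the memory after the header */
}

/* Free by stepping back to the header - valid only for sm_malloc'd pointers. */
static void sm_free(void *p)
{
   struct sm_header *h = (struct sm_header *)p - 1;
   printf("freeing %zu bytes allocated at %s:%d\n", h->size, h->file, h->line);
   free(h);
}

int main(void)
{
   char *ok = sm_malloc(16, __FILE__, __LINE__);  /* allocated by the tracker */
   if (ok) {
      strcpy(ok, "fine");
      sm_free(ok);                                /* header is present: works */
   }

   char *bad = strdup("plain libc allocation");   /* no header in front of it */
   /* sm_free(bad); */  /* would read garbage before 'bad' and free an invalid
                           pointer - the same class of crash bfree_and_null()
                           hits on a non-SMARTALLOC string in bpipe-fd.c */
   free(bad);
   return 0;
}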
I prepared a patch that fixes it.
fix_bacula_fd_segfault_if_bpipe_plugin_is_used.patch
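In essence, the change makes the allocation and the free use the same allocator. A rough sketch of the idea (the function name and the original strdup() call are my assumptions here; the attached patch is what actually applies):

/* Where the plugin command string is stored (e.g. in handlePluginEvent()): */
p_ctx->cmd = bstrdup((char *)value);   /* duplicate via the smart allocator
                                          instead of a plain strdup()       */

/* freePlugin() can then release it with the matching allocator: */
bfree_and_null(p_ctx->cmd);

/* Alternatively, keeping a plain strdup() would require a plain free();
   mixing strdup() with bfree_and_null() is what segfaults under SMARTALLOC. */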
I am also attaching the traceback file.
Steps to Reproduce
- Create a Fileset with the BPIPE plugin:
  Fileset {
    Name = "TAR backup"
    Include {
      Plugin = "bpipe:/var/log/httpd.tar:tar -cf - /var/log/httpd 2>/dev/null:tar -C /tmp -xf -"
    }
  }
- Run a job with that Fileset.
- At the end of the job, the File Daemon segfaults and stops working.
dmesg
[ 858.645137] bacula-fd[8460]: segfault at 7f873a1e38d0 ip 00007f873b101d8a sp 00007f8732a7ba18 error 4 in libc-2.28.so[7f873b068000+1bc000]
[ 858.645144] Code: f3 0f 1e fa 66 0f ef c0 66 0f ef c9 66 0f ef d2 66 0f ef db 48 89 f8 48 89 f9 48 81 e1 ff 0f 00 00 48 81 f9 cf 0f 00 00 77 66 <f3> 0f 6f 20 66 0f 74 e0 66 0f d7 d4 85 d2 74 04 0f bc c2 c3 48 83
- The backup job itself finishes correctly:
localhost-dir JobId 292: Start Backup JobId 292, Job=TAR_backup_1.2024-12-03_20.23.16_38
localhost-dir JobId 292: Connected to Storage "File1" at 10.0.0.134:9103 with TLS
localhost-dir JobId 292: Using Device "FileChgr1-Dev1" to write.
localhost-dir JobId 292: Connected to Client "localhost-fd" at localhost:9102 with TLS
localhost-fd JobId 292: Connected to Storage at 10.0.0.134:9103 with TLS
localhost-sd JobId 292: Volume "Vol-0011" previously written, moving to end of data.
localhost-sd JobId 292: Ready to append to end of Volume "Vol-0011" size=3,213,963,572
localhost-sd JobId 292: Elapsed time=00:00:01, Transfer rate=635.3 K Bytes/second
localhost-sd JobId 292: Sending spooled attrs to the Director. Despooling 400 bytes ...
localhost-dir JobId 292: Bacula localhost-dir 15.0.2 (21Mar24):
Build OS: x86_64-redhat-linux-gnu-bacula redhat Oncilla)
JobId: 292
Job: TAR_backup_1.2024-12-03_20.23.16_38
Backup Level: Full
Client: "localhost-fd" 15.0.2 (21Mar24) x86_64-redhat-linux-gnu-bacula,redhat,Oncilla)
FileSet: "TAR backup" 2024-12-03 20:23:16
Pool: "File" (From Command input)
Catalog: "MyCatalog" (From Client resource)
Storage: "File1" (From Command input)
Scheduled time: 03-gru-2024 20:23:16
Start time: 03-gru-2024 20:23:19
End time: 03-gru-2024 20:23:20
Elapsed time: 1 sec
Priority: 10
FD Files Written: 2
SD Files Written: 2
FD Bytes Written: 634,880 (634.8 KB)
SD Bytes Written: 635,346 (635.3 KB)
Rate: 634.9 KB/s
Software Compression: None
Comm Line Compression: 90.0% 10.0:1
Snapshot/VSS: no
Encryption: no
Accurate: no
Volume name(s): Vol-0011
Volume Session Id: 3
Volume Session Time: 1733274152
Last Volume Bytes: 3,214,600,010 (3.214 GB)
Non-fatal FD errors: 0
SD Errors: 0
FD termination status: OK
SD termination status: OK
Termination: Backup OK
localhost-dir JobId 292: Begin pruning Jobs older than 6 months .
localhost-dir JobId 292: No Jobs found to prune.
localhost-dir JobId 292: Begin pruning Files.
localhost-dir JobId 292: No Files found to prune.
localhost-dir JobId 292: End auto prune.
Thanks in advance for your help.
Best regards, Marcin