#let title = [
  Unit 1: Introduction to cyber forensics
]
#set text(12pt)
#set page(
  header: [ 
    #box()[
      Knowledge not shared remains unknown.
    ]
    #h(1fr)
    #box()[#title]
  ],
  numbering: "1 of 1",
)
#align(center, text(20pt)[
  *#title*
])
#show table.cell.where(y: 0): strong
#outline()
#pagebreak()

= 1. Define cyber forensics with its role and importance in investigating cyber crimes.
Cyber forensics is the application of investigative techniques to collect, analyze, and present digital evidence for legal purposes. Its key objectives are to preserve evidence integrity, recover data, and identify perpetrators.
\ Cyber forensics plays a crucial role in investigating cybercrimes by enabling investigators to recover deleted or hidden data, analyze system logs, trace malicious activities, and determine how an attack occurred. It helps identify the attacker by following digital footprints such as IP traces, malware signatures, and unauthorized access patterns. It also supports incident reconstruction, allowing investigators to understand the timeline and method of the crime.
\ The importance of cyber forensics lies in its ability to maintain a proper chain of custody and ensure that digital evidence remains authentic and untampered. It aids law enforcement and judicial authorities in prosecuting cybercriminals effectively. Furthermore, cyber forensics helps organizations strengthen their cybersecurity by revealing exploited vulnerabilities, preventing future attacks, and minimizing operational damage. Overall, cyber forensics is vital for uncovering truth in cyber incidents and ensuring justice in the digital world.
= 2. Explain digital forensics and discuss its scope and relevance in the modern cyber ecosystem.
Digital forensics is a specialized branch of forensic science that involves the systematic identification, acquisition, preservation, examination, and analysis of digital evidence from electronic devices such as computers, mobile phones, storage media, networks, and cloud environments. It follows scientifically proven procedures to ensure that the collected evidence remains authentic, reliable, and admissible in a court of law. The primary objective of digital forensics is to reconstruct events related to cyber incidents and support investigations involving cybercrimes, data breaches, fraud, unauthorized access, or misuse of digital systems.
\ The scope of digital forensics is broad and spans multiple domains. It includes computer forensics, focusing on desktops, laptops, and servers; mobile forensics, involving smartphones and handheld devices; network forensics, which examines network traffic and communication logs; cloud forensics, which deals with distributed cloud environments; and malware forensics, which analyzes malicious software to understand its behavior and source. Additionally, it covers digital evidence handling, data recovery, incident response, and forensic readiness for organizations.
\ Digital forensics holds significant relevance in the modern cyber ecosystem due to the rising dependence on digital technologies and the increasing number of cyberattacks. As cybercrimes become more sophisticated, digital forensics provides the tools and methodologies to trace attackers, uncover data manipulation, and support law enforcement in prosecuting offenders. It also plays a crucial role in organizational cybersecurity by helping detect intrusions, analyze vulnerabilities, and implement preventive measures. In an era dominated by cloud computing, IoT, mobile devices, and widespread digital transactions, digital forensics has become essential for maintaining trust, ensuring accountability, and protecting sensitive information across the digital landscape.
= 3. Types of cyber crimes with examples.
1. *Cyber-dependent crimes*
These are crimes that can only occur using computers, networks or the internet and have no offline version. These crimes directly target the confidentiality, integrity and availability of digital systems and require technical knowledge to execute.\
Examples include:
  - Hacking / Unauthorized access: Breaking into servers or databases to steal or alter information.
  - Malware attacks: Viruses, worms, trojans and ransomware used to damage systems or encrypt data for ransom.
  - DoS/DDoS attacks: Overloading a website or service to make it unavailable.
  - Botnet creation: Infecting multiple devices and controlling them remotely for attacks.
  - Cryptojacking: Secretly using a victim’s computer power to mine cryptocurrency.
2. *Cyber-enabled crimes*
These are traditional crimes that already existed but are expanded, accelerated or made easier using digital technology. These crimes use digital platforms to increase the scale, anonymity and speed of conventional criminal activities, making them harder to detect and control.\
Examples include:
  - Online fraud and phishing: Deceiving users through fake emails or websites to steal money or credentials.
  - Identity theft: Misusing stolen personal or financial information for illegal transactions.
  - Cyberbullying and harassment: Abusive behaviour carried out through social media or messaging apps.
  - Online child exploitation: Sharing illegal content or grooming minors via hidden forums.
  - Financial crimes: Money laundering or illegal transfers via digital wallets or cryptocurrencies.
  - Piracy and IP theft: Uploading or distributing pirated movies, music or software through torrent sites.
= 4. Classify cyber crimes and explain any three recent data breach cases.
1. *Cyber crimes against Individuals:*
These crimes target a person’s private, financial, health or identity information.
A recent example is the widespread phishing-based Aadhaar and PAN data theft incidents in India (2023-2024), where attackers used fake KYC update messages and links to steal citizens’ identity details, bank information, and OTPs. This led to large-scale financial fraud, identity misuse, and unauthorized digital transactions. The attacks specifically targeted individual users, exploiting their trust and lack of awareness.
2. *Cyber crimes against Organisations:*
These crimes are aimed at companies and institutions to steal customer data, disrupt operations, or demand ransom.
A major recent example is the Capita data breach (2023), where hackers infiltrated the UK-based outsourcing firm’s systems and accessed data belonging to millions of people, including confidential corporate information and special-category personal data. Although individuals were affected indirectly, the primary target was the organisation, which faced operational disruption, financial penalties, and loss of trust.
3. *Cyber crimes against Government:*
These target government agencies, national databases, or critical infrastructure, often putting entire populations at risk.
A notable recent example is the Bangladesh Government Birth & Death Registration breach (2023), where weaknesses in a government website exposed personal information of more than 50 million citizens. This incident highlighted how government systems, if poorly secured, can cause wide-scale identity threats, compromise national records, and undermine citizen trust in public digital services.
= 5. Write a short note on cyber forensics model and phases involved. 
A cyber forensics model is a structured framework that guides investigators in handling digital evidence in a scientific and legally acceptable manner. It ensures that every step of the investigation is performed systematically so that the evidence remains authentic, reliable, and admissible in court. The model helps forensic experts understand what happened during a cyber incident, how it happened, and who was involved.
1. *Identification:*
In this phase, investigators determine that an incident has occurred and identify the potential sources of digital evidence. This includes recognising compromised systems, logs, storage devices, network traces, or suspicious activities. Proper identification ensures that no critical evidence is overlooked.
2. *Preservation:*
Once the evidence is identified, it must be protected from alteration, damage, or loss. This involves isolating affected systems, creating bit-by-bit forensic images, maintaining a strict chain of custody, and preventing any changes to original data. Preservation ensures the integrity and authenticity of evidence for legal use.
3. *Collection:*
This phase involves systematically acquiring the digital evidence using forensic tools and procedures. Data such as logs, files, memory dumps, metadata, and network captures is collected in a controlled manner. The goal is to gather all relevant information without tampering with the original source.
4. *Analysis:*
Collected evidence is examined to reconstruct events, identify attack vectors, trace user activities, detect malware, or recover deleted data. Investigators correlate timelines, extract patterns, and interpret the technical findings to determine what happened, how it happened, and who was responsible.
5. *Reporting:*
The final phase involves documenting all observations, methods used, tools applied, and conclusions reached. The report must be clear, accurate, and legally admissible, presenting the evidence in a form understandable to non-technical stakeholders such as lawyers, judges, or organisational authorities. It may also include recommendations for prevention.
= 6. Explain the concept of digital evidence and write the steps for evidence lifecycle from collection to management.
Digital evidence refers to any information stored or transmitted in digital form that can be used in legal or investigative processes. It is crucial in modern cases involving cybercrime, financial fraud, identity theft, intellectual property violation, and computer misuse. Digital evidence must be admissible, authentic, and maintain integrity. This requires proper procedures for collection, preservation, and maintaining chain of custody to prove that the evidence has not been tampered with.
Characteristics of Digital Evidence:
- *Intangible*: Exists electronically and requires specialised tools to view or extract.
- *Volatile*: Easily altered, deleted, or overwritten if not handled properly.
- *Replicable*: Can be copied without affecting the original.
- *Rich in Metadata*: Contains timestamps, user details, and access history useful during investigation.
Types of Digital Evidence:
- *Computer-based evidence*: files, documents, system logs, metadata, and user activity records.
- *Network-based evidence*: packet captures, firewall/router logs, IDS/IPS logs, proxy logs, and email headers.
- *Mobile-based evidence*: call logs, SMS/MMS, app data, GPS records, browser history, photos, and videos.
The evidence lifecycle ensures that digital evidence is handled securely and remains legally acceptable throughout the investigation.
1. *Collection*: Investigators gather evidence from computers, networks, mobile devices, or cloud sources using authorised forensic tools. Care is taken to avoid altering original data.
2. *Preservation*: Evidence is secured in its original state by making forensic images, isolating devices, sealing storage media, and maintaining a clear chain of custody. This protects the integrity of the evidence.
3. *Examination*: Preserved data is examined to identify relevant information using techniques like file system review, keyword searches, timeline reconstruction, and log analysis.
4. *Analysis*: The extracted information is studied in depth to understand how the incident occurred, what actions were taken, and who was involved. Investigators correlate logs, recover deleted data, trace user activity, and detect malware behaviour.
5. *Documentation and Reporting*: All steps and findings are recorded in a structured forensic report that explains methods used, evidence collected, timelines, and conclusions. This ensures clarity for legal or organisational authorities.
6. *Storage and Management*: After the case, evidence is securely stored with proper access control and archival procedures. It may be retained for future legal proceedings or audits.
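The preservation and management phases above depend on cryptographic hashing and a documented chain of custody. A minimal Python sketch of the idea (the `CustodyLog` class and its fields are hypothetical, invented for illustration):

```python
import hashlib
from datetime import datetime, timezone

def sha256_of(data: bytes) -> str:
    # The fingerprint recorded at seizure; any later mismatch proves alteration.
    return hashlib.sha256(data).hexdigest()

class CustodyLog:
    """Hypothetical chain-of-custody log: who handled the evidence, when,
    and whether its hash still matches the one taken at collection."""

    def __init__(self, evidence: bytes):
        self.original_hash = sha256_of(evidence)
        self.entries = []

    def record_transfer(self, handler: str, evidence: bytes) -> bool:
        current = sha256_of(evidence)
        self.entries.append({
            "handler": handler,
            "time": datetime.now(timezone.utc).isoformat(),
            "hash": current,
        })
        return current == self.original_hash  # True while evidence is intact

log = CustodyLog(b"forensic image bytes")
print(log.record_transfer("Investigator A", b"forensic image bytes"))  # True
```

Real practice additionally uses write blockers and signed, timestamped records, but the core check is the same: the hash at every hand-off must equal the hash taken at collection.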
= 7. Write ethical, professional and legal issues for cyber forensic investigations.
Cyber forensic investigations involve the handling of sensitive digital evidence, and therefore investigators must follow strict ethical, professional, and legal standards. Any deviation can compromise the investigation, violate privacy, or make the evidence inadmissible in court.
== Ethical Issues:
Ethical concerns arise because digital evidence often contains personal, confidential, or sensitive information.
- *Privacy Violations*: Investigators may unintentionally access private files or data unrelated to the case, raising ethical dilemmas about confidentiality.
- *Misuse of Access*: Forensic tools provide deep system access, so unethical behaviour such as leaking information, snooping, or unauthorised data copying must be avoided.
- *Bias and Objectivity*: Investigators must remain neutral and avoid altering, exaggerating, or selectively presenting evidence to favour one party.
- *Data Minimisation*: Only relevant data should be examined; unnecessary exposure of personal content may be ethically inappropriate.
== Professional Issues:
Professionalism ensures that forensic work maintains accuracy, integrity, and trust.
- *Competence and Skill*: Investigators must have proper training, certifications, and technical knowledge. Incompetent handling can corrupt evidence.
- *Use of Standard Procedures*: Following recognised forensic models, maintaining chain of custody, and documenting every step are essential professional responsibilities.
- *Tool Reliability*: Forensic tools must be validated and updated; using outdated or unreliable tools can give inaccurate results.
- *Maintaining Confidentiality*: Professionals must protect all case-related information from unauthorised disclosure.
== Legal Issues:
Legal issues directly affect whether evidence is accepted in court and whether the investigation itself is lawful.
- *Search and Seizure Laws*: Investigators must follow proper legal authority before accessing devices or networks; unauthorised search makes evidence inadmissible.
- *Chain of Custody Requirements*: Every transfer of evidence must be documented to prove it was not tampered with.
- *Admissibility Rules*: Evidence must meet standards of authenticity, integrity, and relevance. Improper handling or contamination may lead to rejection.
- *Jurisdictional Challenges*: Cybercrimes often cross national borders, creating legal conflicts regarding which country’s laws apply.
- *Data Protection Laws*: Investigators must comply with privacy and data protection regulations such as IT Act rules, GDPR, or local digital privacy laws.
= 8. Write a note on ethical dilemmas and professional responsibilities of forensic expert in practice.
1. *Ethical Dilemmas:*
  - *Privacy vs. Investigation Needs*: Digital evidence may contain private or unrelated personal data. The expert must decide how much information to access without violating privacy.
  - *Bias and Objectivity*: Pressure from employers, clients, or law enforcement may influence findings. The expert must avoid bias and maintain neutrality even if results go against expected outcomes.
  - *Handling Sensitive or Confidential Data*: Access to emails, chats, medical records, or financial information can tempt misuse. The dilemma arises between what is legally required and what is ethically appropriate to view.
  - *Reporting Unfavourable Findings*: When evidence contradicts the client’s interests, experts may struggle between honesty and professional pressure.
  - *Scope Creep*: Investigators may discover evidence unrelated to the case. Deciding whether to report or ignore such findings can create ethical conflicts.
2. *Professional Responsibilities:*
  - *Objectivity and Impartiality*: Evidence must be examined without personal bias. Conclusions should be based solely on facts and scientific methods.
  - *Competence and Skill*: The expert must possess up-to-date technical knowledge, use validated tools, and continuously improve skills to avoid errors.
  - *Following Standard Procedures*: This includes proper evidence handling, chain-of-custody maintenance, forensic imaging, documentation, and compliance with investigation protocols.
  - *Confidentiality*: Sensitive case information must be protected from unauthorised access or disclosure. Experts must respect the privacy of individuals involved.
  - *Accurate Documentation and Testimony*: Reports must be clear, truthful, and technically correct. If called to court, the expert must present evidence honestly and explain methods transparently.
  - *Integrity and Professional Conduct*: Forensic experts must avoid actions that could mislead the court, such as altering evidence, overstating expertise, or giving false testimony.
= 9. Differentiate between FAT32 and NTFS
#table(
  columns: (auto, auto, auto),
  table.header([Feature], [FAT32], [NTFS]),
  [Maximum file size], [Limited to 4 GB per file], [Supports files far larger than 4 GB],
  [Maximum partition size], [2 TB (Windows formats at most 32 GB)], [Very large volumes (256 TB or more on modern Windows)],
  [Security Features], [Very basic, no file level permissions, no encryption], [Supports file/folder permissions, access control lists and encryption],
  [Reliability], [Less reliable, prone to fragmentation and corruption], [More reliable, supports journaling and self recovery],
  [Performance], [Faster on small drives or simple devices], [Better performance on large volumes or modern systems],
)
= 10. Working and structure of FAT and NTFS
1. *Working of FAT (File Allocation Table)*
  FAT is a simple file system used mainly in USB drives, memory cards, and older Windows systems. Its working is based on a table used to keep track of where file data is stored. The following are the steps used by FAT:
  1. Disk is divided into clusters
    -	FAT breaks the disk into small equal units called clusters.
    -	Every file is stored by occupying one or more clusters.
  2. FAT Table is created at the beginning of the drive
    - This table contains an entry for every cluster on the disk.
  3. A file is saved in multiple clusters
    - If a file is large, it will be stored in many scattered clusters.
    - File A is stored in clusters: 5 -> 6 -> 8
  4. FAT Table stores pointers to the next cluster
    - FAT entry looks like:
#table(
  columns: (auto, auto),
  table.header([Cluster Number], [ FAT Entry Meaning  ]),
  [5], [Next cluster is 6],
  [6], [Next cluster is 8],
  [8], [EOF]
)
  5. When a file is opened, the following things happen:
    1. The OS looks at the starting cluster of the file.
    2. It checks the FAT table to locate the next cluster.
    3. It continues following the chain from cluster to cluster.
    4. When it reaches EOF, the file is completely read.
#figure(
  image("assets/fatstructure.png", width: 100%),
)
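The chain in the table above can be sketched as a simple lookup loop. Everything here is illustrative: a real FAT marks end-of-chain with a reserved numeric value (e.g. 0xFFF8..0xFFFF for FAT16), not the string used below.

```python
EOF = "EOF"  # stand-in for FAT's reserved end-of-chain marker

# FAT entries for File A from the example: clusters 5 -> 6 -> 8 -> EOF
fat = {5: 6, 6: 8, 8: EOF}

def read_chain(fat, start_cluster):
    """Follow the linked-list chain cluster by cluster, as the OS does."""
    chain = []
    cluster = start_cluster
    while cluster != EOF:
        chain.append(cluster)
        cluster = fat[cluster]
    return chain

print(read_chain(fat, 5))  # [5, 6, 8]
```

This linked-list structure is exactly why FAT lookups are slow for large fragmented files: every cluster requires one more hop through the table.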
2. *Working of NTFS (New Technology File System)*
  NTFS is used in modern Windows systems. It is more advanced than FAT and works like a database. Its core structure is the Master File Table (MFT).
  NTFS works step by step as follows:
  1. NTFS creates the Master File Table (MFT)
    - MFT contains an entry for every file and folder on the disk.
    - Each MFT entry is typically 1 KB and stores all file details.
  2. Each file is stored as metadata (attributes)
    - A file’s MFT entry contains:
      1. File name
      2. Created/modified dates
      3. Security permissions (ACL)
      4. Size and properties
      5. Data or data pointers
  3. Small files stored inside MFT (Resident data)
    - If a file is very small (up to a few hundred bytes), it is stored directly inside the MFT entry.
    - This makes NTFS very fast.
  4. Large files are stored in clusters
    - When a file is big:
      1. MFT entry stores pointers (addresses) to clusters.
      2. These clusters contain the actual data.
    - Unlike FAT, these pointers are not a chain but organized as extents, making NTFS faster.
  5. NTFS uses journaling
    - Whenever you modify a file:
      1. NTFS writes the changes to a log file (\$LogFile).
      2. If the system crashes, NTFS uses this journal to recover.
    - This makes NTFS more reliable.
#figure(
  image("assets/ntfs_structure.png", width: 100%),
)
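The resident vs. non-resident behaviour can be illustrated with a toy MFT entry. The `RESIDENT_LIMIT` value and the allocator are invented for this sketch; real NTFS decides residency by whether the attribute fits inside the record.

```python
RESIDENT_LIMIT = 700  # toy threshold in bytes, not a real NTFS constant

next_free_cluster = 100  # toy allocator state

def allocate_extents(data: bytes, cluster_size: int = 4096):
    """Hypothetical allocator: returns one contiguous (start, length) run."""
    global next_free_cluster
    n = -(-len(data) // cluster_size)  # ceiling division: clusters needed
    run = (next_free_cluster, n)
    next_free_cluster += n
    return [run]

def make_mft_entry(name: str, data: bytes) -> dict:
    entry = {"name": name, "size": len(data)}
    if len(data) <= RESIDENT_LIMIT:
        entry["resident_data"] = data           # small file lives inside the record
    else:
        entry["extents"] = allocate_extents(data)  # large file: pointers to cluster runs
    return entry

small = make_mft_entry("note.txt", b"hello")
large = make_mft_entry("photo.jpg", b"\x00" * 10_000)
```

Reading `small` needs no extra disk access at all, while `large` needs one direct lookup of its extent list, never a cluster-by-cluster chain walk as in FAT.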
*Key differences*
#table(
  columns: (auto, auto, auto),
  table.header([Feature], [FAT Working], [NTFS Working]),
  [Data structure], [Linked list chain of clusters], [Database-like MFT entries],
  [File lookup], [Slow, must follow chain], [Fast, direct lookup in MFT],
  [Reliability], [Poor, no journaling], [High, uses journaling],
  [Small files], [Normal clusters], [Can be stored in MFT],
  [Large files], [Fragment easily], [Managed as extents, less fragmentation]
)
= 11. Importance of boot sector and windows registry
== Importance of the Boot Sector
  The boot sector is the first sector of a storage device and contains essential information required to start the operating system. It plays a foundational role in the system startup process.
  1. *Contains Boot Loader Code*\
    The boot sector stores the initial boot program (MBR or VBR). When a computer starts, the BIOS/UEFI loads this code into memory to begin the operating system boot process. If this sector is corrupted, the system will fail to boot.
  2. *Stores File System Information*\
    - It contains details such as:
      1. File system type (FAT32, NTFS)
      2. Cluster and sector size
      3. Location of important file system structures
    - This information helps the OS understand how data is organized on the disk.
  3. *First Point of Access for Disk Operations*\
    The OS uses the boot sector to locate and access other critical disk structures. Without this, the system cannot load system files.
  4. *Essential for System Recovery*\
    Recovery tools use the boot sector to rebuild damaged disks or restore partitions. Any corruption in this sector may render the disk unreadable.
  5. *Important in Cyber Forensics*\
    Forensic investigators examine the boot sector to detect:
      - Partition tampering
      - Hidden volumes
      - Boot sector malware
    It helps reconstruct the disk's structure and understand system startup behavior.
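Forensic tools read these fields at fixed offsets defined by the FAT BIOS Parameter Block. A small sketch that parses two of them from a synthetic (not real) boot sector:

```python
import struct

def parse_bpb(sector: bytes):
    """Read bytes-per-sector (offset 0x0B, little-endian word) and
    sectors-per-cluster (offset 0x0D) from a FAT-style boot sector."""
    bytes_per_sector = struct.unpack_from("<H", sector, 0x0B)[0]
    sectors_per_cluster = sector[0x0D]
    return bytes_per_sector, sectors_per_cluster

# Build a synthetic 512-byte boot sector for demonstration only
sector = bytearray(512)
struct.pack_into("<H", sector, 0x0B, 512)  # 512 bytes per sector
sector[0x0D] = 8                           # 8 sectors per cluster
print(parse_bpb(bytes(sector)))  # (512, 8)
```

These two values together give the cluster size (here 4 KB), which an investigator needs before any cluster address on the disk can be translated into a byte offset.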

== Importance of the Windows Registry
  The Windows Registry is a centralized database that stores configuration data for the operating system, hardware, users, and installed applications. It is crucial for the functioning and stability of a Windows system.
  1. *Central Configuration System*\
    All critical OS settings, such as system parameters, installed programs, driver information, and user preferences, are stored in the registry. This allows Windows to operate consistently and efficiently.
  2. *Controls System Startup*\
    Registry keys store:
      - Startup applications
      - System services to be loaded
      - Boot configuration settings
    Any change to these keys directly affects how Windows starts and operates.
  3. *Stores Hardware Configuration*\
    The registry contains information about all connected hardware devices. Windows uses these entries to detect, configure, and manage devices such as printers, network adapters, and storage drives.
  4. *Application Management*\
    Applications store settings, license data, paths, and user preferences in the registry. This enables software to function smoothly and retain its configurations across sessions.
  5. *Critical for Troubleshooting*\
    Registry entries help diagnose:
      - Startup problems
      - Misconfigured applications
      - Driver issues
    Editing or restoring registry entries is a common method for fixing system errors.
  6. *High Forensic Value*\
    Forensic investigators use registry artifacts to identify:
      - User login activity
      - Recently opened files
      - Connected USB devices
      - Installed or removed software
      - Network and system configurations
    The registry is a key source of digital evidence.
  7. *Target for Malware*\
    Many malware programs modify registry keys to persist at startup, hide their presence, or disable security tools. This makes registry examination an essential part of incident response.
= 12. Importance of understanding file system in identifying cyber crimes and forensic tracing.
Understanding file systems is a fundamental requirement in digital forensics because every action performed on a computer (saving a file, installing software, deleting data, or running a program) interacts with the file system. Cyber criminals often attempt to hide, modify, or destroy evidence, and only a deep understanding of the file system enables forensic investigators to uncover these traces.
1. *Recovering Deleted or Hidden Data*\
  File systems such as FAT32, NTFS, ext4, and APFS manage how files are stored, referenced, and deleted. Even when criminals delete files:
    - File entries may remain in the file table.
    - Data may still exist in clusters, slack space, or unallocated space.
    - Metadata such as timestamps may remain intact.
  Knowing file system behaviors helps investigators retrieve deleted logs, documents, malware, or communication records.
2. *Tracing User Activities Through Metadata*\
  File systems store extensive metadata about files, including:
    - File creation and modification timestamps
    - Ownership and access permissions
    - File size and structure
    - Alternate data streams (NTFS ADS)
  This metadata helps reconstruct user actions, identify tampering, and build a timeline of events critical for cybercrime cases.
3. *Identifying File Tampering and Anti-Forensic Techniques*\
  Cyber criminals may use anti-forensic methods such as:
    - Timestamp manipulation
    - Alternate data streams to hide malicious code
    - File signature spoofing
    - Data obfuscation in slack/unallocated space
  Understanding the file system architecture exposes such anomalies and reveals concealed evidence.
4. *Detecting Malware and Unauthorized Access*\
  Malware often interacts with file systems by:
    - Creating hidden files
    - Modifying system files
    - Injecting code into unused sectors or ADS
    - Storing payloads in unallocated space
  Analysis of file system logs, directory structures, and allocation patterns helps investigators detect malware footprints.
5. *Reconstruction of System Events and User Behavior*\
  By analyzing file system artifacts, investigators can reconstruct sequences such as:
    - Program execution history
    - USB device usage
    - File transfers and downloads
    - Application installation logs
  This reconstruction is essential for insider threat cases, fraud, intellectual property theft, and cyber espionage.
6. *Identifying Unauthorized Data Exfiltration*\
  File system traces help detect:
    - Creation or deletion of suspicious archive files
    - Sudden changes in file sizes
    - Recently accessed files before exfiltration
    - Evidence of copying to external drives
  Such patterns are vital in data breach or corporate espionage investigations.
7. *Validating Integrity and Authenticity of Evidence*\
  File systems support mechanisms that help validate evidence:
    - NTFS uses MFT entries and transaction logs
    - Journaling file systems maintain historical snapshots
    - Hashing ensures data integrity
  Forensic experts use these features to prove evidence has not been tampered with.
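Much of the metadata this answer relies on is exposed directly by the operating system. A self-contained sketch reading size and timestamps with Python's standard library (the file is a throwaway created just for the demo):

```python
import os
import tempfile
from datetime import datetime, timezone

# Create a throwaway file so the example is self-contained
with tempfile.NamedTemporaryFile(delete=False, suffix=".log") as f:
    f.write(b"suspicious entry")
    path = f.name

st = os.stat(path)  # file system metadata record for this file
size_bytes = st.st_size
modified = datetime.fromtimestamp(st.st_mtime, timezone.utc)
accessed = datetime.fromtimestamp(st.st_atime, timezone.utc)

print(size_bytes)              # 16
print(modified.isoformat())    # last-modified timestamp, UTC

os.unlink(path)  # clean up the demo file
```

Real forensic timelines are built from exactly these MAC (modified/accessed/created) values, read from a forensic image rather than the live system so the act of investigation does not update them.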
= 13. Explain data carving techniques and recovery of deleted graphic files with example scenarios.
  - Data carving is the act of searching for particular strings or bytes within a structure. A hex editor or other data-viewing tools can be used to carve for data. 
  - The analyst determines a string or binary pattern to search for, then initiates a search across a device or structure for that string or pattern. 
  - The target can be of whatever scope is appropriate for the task, such as a file, slack space, unallocated space, a full volume, a memory image, or a swap file.
  - The technique can be used to carve for full files, such as recovering deleted JPG image files, or for records, such as recovering portions of a deleted Windows event log.
== How data carving works
  - File Signatures: Every file type has a unique header and footer that can be identified by specialized software. These are called "magic numbers" or "file signatures." Data carving uses these signatures to locate the start and end of files within the raw data.
  - Search for Patterns: The process involves scanning the unallocated space or fragmented storage, looking for these specific signatures. Once identified, the tool "carves" out the data between the known start and end points of a file.
  - Reconstructing Files: After finding the file's start and end, the carving tool will attempt to reconstruct the file. Even if parts of the file are missing or fragmented, it may still be possible to recover significant portions of the file.
== Data Carving Techniques
  1. Header–Footer Carving:
    - This method searches for known file headers and footers that identify file boundaries. For example, a JPEG file typically starts with FFD8 and ends with FFD9 in hexadecimal. The tool extracts all data between these markers.
    - Scenario: Recovering images from a formatted memory card in which directory entries are lost, but JPEG signatures remain intact in unallocated sectors.
  2. Header-Only Carving (Fragmented Files):
    - If footers are missing, like in cases of fragmentation or partial overwrites, tools rely on the header alone and extract a fixed block size after it or use heuristic rules.
    - Scenario: During ransomware analysis, only the beginning of each encrypted JPEG remains. Carvers recover partial images using header-based rules.
  3. Content-Based or Structure-Aware Carving:
    - Advanced techniques recognise internal file structure to carve files more accurately.
    - Scenario: Recovering photos from a physically damaged SD card where only parts of the file structure remain readable.
  4. Fragment Carving / Reassembly:
    - Used when files are stored non-contiguously. Algorithms analyse byte patterns, entropy, and statistical similarities to reassemble fragmented files.
    - Scenario: On a large hard drive with heavy fragmentation, images are scattered across sectors. Tools like Scalpel or Foremost reassemble them.
  5. Semantic Carving (Content Validation):
    - After extracting data, the carved file is validated by checking if it opens correctly or displays expected structure.
    - Scenario: Carved images are verified to ensure they are not random data or corrupted fragments.
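The header–footer technique above can be sketched in a few lines of Python. This is an illustrative toy (the simulated buffer and names are invented for the example); real carvers such as Scalpel or Foremost also handle fragmentation and validate what they extract:

```python
# Minimal header-footer carver for JPEG data (illustrative sketch).
# It extracts every byte span between a JPEG header (FF D8 FF) and
# the next footer (FF D9) in a raw buffer, e.g. unallocated space.

JPEG_HEADER = b"\xff\xd8\xff"
JPEG_FOOTER = b"\xff\xd9"

def carve_jpegs(raw: bytes) -> list[bytes]:
    """Return every span that starts with a JPEG header and ends
    with the next JPEG footer."""
    carved = []
    pos = 0
    while True:
        start = raw.find(JPEG_HEADER, pos)
        if start == -1:
            break
        end = raw.find(JPEG_FOOTER, start + len(JPEG_HEADER))
        if end == -1:
            break                       # header-only fragment: skip here
        carved.append(raw[start:end + len(JPEG_FOOTER)])
        pos = end + len(JPEG_FOOTER)
    return carved

# Simulated unallocated space: zeroes + one "JPEG" + more zeroes.
disk = b"\x00" * 16 + b"\xff\xd8\xff\xe0fakeimagedata\xff\xd9" + b"\x00" * 8
files = carve_jpegs(disk)
print(len(files))                        # 1
print(files[0].startswith(JPEG_HEADER))  # True
```

A real carved candidate would then go through the validation step described under semantic carving.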
== Recovery of graphic files
Graphic files exist in different categories such as bitmap graphics, vector graphics, and metafile graphics, and understanding their structure is essential for recovering deleted images during forensic investigations.
1. *Bitmap graphics* consist of pixels arranged in a grid, where each pixel stores a specific color value. These files are also known as raster images and are arranged in rows to make printing easier. Bitmap files vary in quality based on resolution and color depth. Higher resolutions and greater color depth provide more detail. During recovery, forensic tools look for pixel structures, color patterns, and known file signatures to reconstruct deleted bitmap images.
2. *Vector graphics* store images using mathematical equations, making them scalable without loss of quality. Because they are not pixel-based, forensic recovery focuses on identifying vector instructions and structures rather than pixel grids.
3. *Metafile graphics* combine both raster and vector components. Forensic examiners must handle this mixed data carefully: bitmap portions lose resolution when enlarged, while vector portions remain sharp.
4. Graphics files are created and edited using graphics editors such as Microsoft Paint, Adobe Photoshop, GIMP, or Illustrator. These tools support various standard formats such as PNG, GIF, JPEG, TIFF, and BMP, as well as vector formats such as DXF and HPGL. Some formats are proprietary or obsolete, and forensic investigators may need special software to open them. Unknown files can be identified through online signature databases such as Gary Kessler’s file signature library or Webopedia.
- In recovering deleted graphic files, it is important to understand that format conversion may change metadata or reduce color information. Bitmap and raster images store color based on bits per pixel, and saving them in a format with lower color depth can degrade image quality.
// - Digital photographs, including RAW and EXIF formats, are especially significant in forensics. RAW files store unprocessed image data like a digital negative, offering the highest quality but often requiring manufacturer-specific tools for viewing. EXIF format stores metadata such as camera model, shutter speed, resolution, date, time, and GPS location. Tools such as Exif Reader, IrfanView, Autopsy, and Magnet Axiom extract metadata that helps investigators determine when and where a photo was taken and which device captured it.
= 14. Process of forensic imaging using industrial data acquisition tools.
Forensic imaging is the process of creating an exact bit-by-bit copy of digital storage media for investigation. Industrial data acquisition tools such as Cellebrite UFED, FTK Imager, EnCase Imager, X-Ways Forensics, Tableau hardware imagers, and write-blockers are used to ensure accuracy, integrity, and legal admissibility. The goal is to acquire data without altering the original source.
  1. *Preparation and Documentation*\
    The investigator begins by documenting the device details such as storage type, serial number, condition, and connection interfaces. Chain of custody is initiated. A secure forensic workstation is prepared with write-blockers and industrial imaging tools.
  2. *Device Isolation and Write-Blocking*\
    The source drive or mobile device is isolated to prevent any accidental write operations. Hardware write-blockers or software write-protection features ensure that no data is modified during acquisition. This guarantees that the original evidence remains untouched.
  3. *Selecting the Acquisition Method*\
    Depending on the device and investigation needs, the tool performs:
      - *Physical imaging:* bit-by-bit copy of entire disk including deleted space
      - *Logical imaging:* extracts active files and folders
      - *Targeted acquisition:* specific partitions or directories
    Industrial tools automatically detect which acquisition modes a device supports.
  4. *Using Industrial Imaging Tools*\
    Tools like FTK Imager or EnCase Imager allow the investigator to choose the destination path, image format, and hashing algorithms. The tool reads the source bit-by-bit and writes an exact forensic copy to the destination storage.
  5. *Hash Generation and Verification*\
    Before and after imaging, the tool generates hash values of both original media and the forensic image. Matching hashes confirm that the acquired image is an exact, unaltered replica of the original device.
  6. *Image Integrity Checking*\
    Industrial tools perform automated integrity checks, verify sectors, and log any read errors. Bad sectors or unreadable areas are documented to maintain transparency in the forensic process.
  7. *Documentation and Logging*\
    All actions taken by the imaging tool (start time, end time, hash values, errors, device details) are automatically recorded in a log file. This documentation is essential for legal presentation and court testimony.
  8. *Secure Storage of Forensic Image*\
    The acquired forensic image is stored in a secure repository with controlled access. Backup copies may be created for analysis while preserving the original image as primary evidence.
  9. *Preservation of Original Device*\
    After imaging, the source device is sealed, labelled, and preserved for future verification. Further investigation is performed only on the forensic image, not the original device.
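The hash verification in step 5 can be sketched in Python. File names below are illustrative; industrial tools compute these digests automatically during imaging:

```python
# Sketch of step 5: hash source media and the acquired image with
# SHA-256 and compare digests. Matching digests prove the image is a
# bit-identical replica.
import hashlib
import os
import tempfile

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash a file in fixed-size chunks so large images fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Two temporary files stand in for the source drive and the forensic image.
with tempfile.TemporaryDirectory() as d:
    source = os.path.join(d, "source_drive.raw")
    image = os.path.join(d, "forensic_image.dd")
    data = os.urandom(4096)
    for p in (source, image):
        with open(p, "wb") as f:
            f.write(data)
    print(sha256_of(source) == sha256_of(image))  # True: exact replica
```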
= 15. Process of file reconstruction and recovery
File reconstruction and recovery is a key digital forensic process used to restore deleted, damaged, or fragmented files from storage media. When files are deleted, the file system usually removes only directory entries while the actual data remains on disk until overwritten. Forensic tools use low-level analysis, file signatures, metadata, and data patterns to rebuild files back to their usable form. The process generally involves the following steps:
  1. *Identifying File System and Storage Layout*\
    The investigator examines the file system to understand how files are stored, how deletion works, and where unallocated space or slack space resides. This helps determine whether files are contiguous or fragmented.
  2. *Scanning for File Signatures (Header/Footer Examination)*\
    File reconstruction begins by locating known file signatures, since many file types have identifiable headers and footers. Tools scan the entire disk surface, including unallocated space, for these patterns to identify deleted files.
  3. *Carving Data Blocks from Storage*\
    - Once signatures are found, the forensic tool extracts corresponding data blocks.
    - Header–footer carving is used when both boundaries exist.
    - Header-only carving is used when the file is fragmented or partially overwritten.
    - This step reconstructs the raw content of the file.
  4. *Reassembling Fragmented Files*\
    In many cases, files are not stored contiguously. Fragmentation requires the tool to identify and reorder fragments using:
      - File structure patterns
      - Byte similarity
      - Entropy analysis
      - Sector sequencing
      - Content continuity
  5. *Validating File Structure*\
    Recovered files are checked for internal consistency. For example:
      - JPEGs are validated by checking EXIF blocks and segment markers.
      - PNGs must have proper chunk structures.
      - Documents must match expected XML or binary layout.
    Invalid or corrupted fragments are discarded.
  6. *Metadata Analysis*\
    Tools examine metadata associated with recovered files (timestamps, hash values, EXIF tags, file size, authorship) to confirm authenticity and origin. Metadata also helps link the file to user activities or devices.
  7. *Repairing Partially Damaged Files*\
    If the file is incomplete, tools attempt reconstruction by:
      - Rebuilding headers
      - Filling missing values
      - Repairing internal structure
      - Merging partial fragments
    For images, analysts may recover partial pixels or thumbnails.
  8. *Hashing and Integrity Checking*\
    Recovered files are hashed to generate digital fingerprints. Hash values ensure that reconstructed content remains unaltered during subsequent analysis.
  9. *Documentation of Recovery Process*\
    All steps (methods used, tools applied, fragments found, errors, and final reconstruction results) are fully documented. This supports legal admissibility and forensic transparency.
  10. *Exporting and Storing Recovered Files*\
    The reconstructed files are saved in standard formats and stored securely. Original recovered chunks may be preserved for further analysis or verification.
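The structure validation described in step 5 can be sketched for PNG. This is a simplified check (real tools also verify chunk CRCs and decode the pixel data): a valid PNG starts with an 8-byte signature and consists of length–type–data–CRC chunks ending in IEND.

```python
# Rough validator for a carved PNG candidate (a sketch).
import struct
import zlib

PNG_SIG = b"\x89PNG\r\n\x1a\n"

def looks_like_png(data: bytes) -> bool:
    """Walk the chunk list; accept only if an IEND chunk is reached."""
    if not data.startswith(PNG_SIG):
        return False
    pos = len(PNG_SIG)
    while pos + 8 <= len(data):
        length, ctype = struct.unpack(">I4s", data[pos:pos + 8])
        chunk_end = pos + 8 + length + 4      # data + 4-byte CRC
        if chunk_end > len(data):
            return False                      # truncated fragment
        if ctype == b"IEND":
            return True
        pos = chunk_end
    return False

def chunk(ctype: bytes, payload: bytes) -> bytes:
    """Build one PNG chunk with its CRC."""
    return (struct.pack(">I", len(payload)) + ctype + payload
            + struct.pack(">I", zlib.crc32(ctype + payload)))

# A minimal, structurally valid PNG skeleton (IHDR + IEND only).
tiny = (PNG_SIG
        + chunk(b"IHDR", struct.pack(">IIBBBBB", 1, 1, 8, 0, 0, 0, 0))
        + chunk(b"IEND", b""))
print(looks_like_png(tiny))          # True
print(looks_like_png(b"\x00" * 32))  # False: random data is discarded
```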
= 16. Demonstrate how whole disk encryption affects forensic imaging and retrieval.
Whole Disk Encryption (WDE) encrypts an entire storage device: operating system files, user data, temporary files, and even unallocated space. Tools like BitLocker, VeraCrypt, FileVault, and LUKS transform all disk sectors into unreadable ciphertext. While this protects user privacy, it creates several challenges for forensic imaging and evidence retrieval.
  1. *Imaging Captures Only Encrypted Data*\
    - When a forensic analyst performs a sector-by-sector image of an encrypted disk, the resulting forensic image contains only encrypted ciphertext, not the actual readable files.
    - File names, folder structure, metadata, and content remain inaccessible.
    - Even deleted data is encrypted, preventing carving or recovery.
  2. *No Access Without Decryption Keys*\
    - Retrieving usable evidence requires:
      - User password/PIN
      - Recovery keys
      - TPM-protected keys
      - Keyfiles
    - Without these, the forensic image is practically useless because modern encryption cannot feasibly be brute-forced.
  3. *Live System Acquisition May Be Required*\
    - If the encrypted machine is powered on and unlocked, investigators often perform live acquisition, because encryption keys reside in RAM.
    - Tools extract keys from memory (cold boot attacks, RAM imaging).
    - Shutting down the system destroys keys, making data inaccessible.
  4. *Prevents Traditional Data Carving*\
    Because all sectors are encrypted, there are:
      - No recognizable headers
      - No footers or signatures
      - No usable patterns
    Thus, deleted files cannot be carved or reconstructed until the disk is decrypted.
  5. *Forensic Imaging Still Required for Integrity*\
    - Even though the data is encrypted, forensic imaging is still performed:
      - To preserve the exact encrypted state
      - To maintain chain of custody
      - To enable later decryption if keys are recovered
      - To document storage structure and encryption configuration
  6. *Challenges in Metadata Retrieval*\
    WDE encrypts:
      - File system metadata
      - Timestamps
      - Directory entries
      - Logs
    This prevents timeline analysis, EXIF extraction, access history checks, and file structure inspection until decryption occurs.
  7. *Hardware-Backed Encryption Adds Difficulty*\
    TPM-based encryption:
      - Stores keys inside hardware
      - Relies on boot integrity measurements
    Breaking this protection is extremely difficult without the correct authentication token.
  8. *Cloud-Synced Encrypted Devices Complicate Analysis*\
    Devices using WDE often sync data to cloud services. Encrypted disk contents may differ from cloud-stored versions, requiring separate legal access to cloud accounts for evidence collection.
= 17. Process of analyzing network traffic for forensic investigation.
Network traffic analysis is a crucial part of cyber forensic investigations used to detect malicious activity, reconstruct events, and identify attackers. It involves capturing, inspecting, and interpreting packets flowing through a network. The process generally follows these systematic steps:
  1. *Identification of Network Sources and Scope*\
    Investigators begin by identifying what needs to be monitored: servers, routers, firewalls, switches, or endpoints. They determine log sources and define the time window relevant to the incident.
  2. *Capturing Network Traffic*\
    Live packet capture is performed using tools like Wireshark, tcpdump, Tshark, or specialized forensic appliances. Investigators may also analyze previously saved packet capture (PCAP) files. Capture points must be strategically placed to ensure visibility of inbound and outbound traffic.
  3. *Ensuring Integrity of Captured Data*\
    Captured packets are saved in standard forensic formats such as PCAP. Hash values are generated to ensure the traffic data remains unaltered. Maintaining the chain of custody prevents tampering claims.
  4. *Filtering and Session Reconstruction*\
    Raw traffic is large and noisy. Investigators filter packets based on:
      - Source/destination IP
      - Ports
      - Protocols
      - Time of communication
    They then reconstruct sessions such as HTTP requests, DNS queries, or TCP streams to understand how communication occurred.
  5. *Protocol and Payload Analysis*\
    Each protocol is inspected for anomalies or suspicious behaviour. Examples include:
      - Unusual port usage
      - DNS tunneling
      - Suspicious HTTP POST requests
      - Encrypted traffic with unexpected destinations
    Payload inspection reveals evidence of malware downloads, command-and-control communication, or data exfiltration.
  6. *Identifying Indicators of Compromise (IOCs)*\
    Analysts look for:
      - Malicious IPs or domains
      - Blacklisted URLs
      - Known malware signatures
      - Strange traffic patterns (spikes, beaconing, repeated failed logins)
    These IOCs help trace back intrusions and identify compromised machines.
  7. *Timeline and Event Correlation*\
    Network events are correlated with system logs, firewall entries, authentication logs, and server events. This helps reconstruct the exact sequence of actions taken by an attacker, such as when they entered the network, what they accessed, and how data was transferred.
  8. *Detecting Data Exfiltration*\
    Investigators analyze outbound traffic for unusual uploads, encrypted payloads, or large data transfers to unknown destinations. Frequency analysis and bandwidth evaluation help detect covert exfiltration.
  9. *Analyzing Malware Communication Patterns*\
    If malware is suspected, analysts inspect:
      - Repeated periodic connections (beaconing)
      - Communication with foreign C2 servers
      - Encrypted or hidden channels (VPN, proxies, Tor)
    This provides insights into attacker behaviour and intent.
  10. *Reporting and Documentation*\
    After analysis, investigators prepare a detailed report describing tools used, packets analyzed, findings, IOCs, timelines, and conclusions. The report must be clear, legally admissible, and supported by PCAP evidence.
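The PCAP files referenced throughout these steps have a simple binary layout: a 24-byte global header followed by per-packet records (a 16-byte record header plus the captured bytes). A minimal stdlib-only reader sketches what capture tools store; real analysis uses Wireshark or tshark:

```python
# Minimal reader for the classic libpcap file format (a sketch).
import struct

PCAP_MAGIC = 0xA1B2C3D4   # written in the file's native byte order

def read_pcap(data: bytes):
    """Return a list of (timestamp_seconds, packet_bytes) tuples."""
    magic = struct.unpack("<I", data[:4])[0]
    endian = "<" if magic == PCAP_MAGIC else ">"
    packets = []
    pos = 24                                   # skip the global header
    while pos + 16 <= len(data):
        ts_sec, ts_usec, incl_len, orig_len = struct.unpack(
            endian + "IIII", data[pos:pos + 16])
        packets.append((ts_sec, data[pos + 16:pos + 16 + incl_len]))
        pos += 16 + incl_len
    return packets

# Build a synthetic one-packet capture for demonstration:
# magic, version 2.4, thiszone, sigfigs, snaplen, linktype (Ethernet).
global_hdr = struct.pack("<IHHiIII", PCAP_MAGIC, 2, 4, 0, 0, 65535, 1)
payload = b"\xde\xad\xbe\xef"
record = struct.pack("<IIII", 1700000000, 0, len(payload), len(payload)) + payload
pkts = read_pcap(global_hdr + record)
print(len(pkts), pkts[0][1].hex())   # 1 deadbeef
```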
= 18. Write the methods and tools used for capturing and analyzing network traffic during forensic investigation.
Network traffic analysis is an essential part of digital forensics used to detect intrusions, trace attackers, and reconstruct malicious activities. The process relies on well-defined methods of capturing traffic and specialized tools for examining the captured data.\
The methods of Capturing and Analyzing Network Traffic are:
  1. *Packet Sniffing (Live Capture)*\
    - Involves capturing real-time network packets passing through an interface.
    - Used to observe ongoing attacks, suspicious connections, or data exfiltration.
    - Requires the network card to be in promiscuous mode.
  2. *Port Mirroring / SPAN (Switch Port Analyzer)*\
    - Traffic from one or more switch ports is mirrored to a monitoring port.
    - Allows investigators to view all traffic flowing through a switch segment.
    - Common in enterprise environments.
  3. *Network Tap*\
    - A dedicated hardware device placed between network links to copy traffic without interfering.
    - Provides clean, complete packet capture.
    - Ideal for forensic-grade monitoring.
  4. *Log Collection and Analysis*\
    - Captures network activity through logs rather than raw packets.
    - Sources include firewall logs, IDS/IPS logs, router logs, and proxy logs.
    - Useful for long-term investigations or when packet capture isn't possible.
  5. *Full Packet Capture (FPC)*\
    - Captures every packet, including headers and payloads.
    - Enables deep analysis and session reconstruction.
    - Storage-intensive but forensically powerful.
  6. *Flow-Based Monitoring*\
    - Captures metadata rather than full packets.
    - Useful for detecting scans, anomalies, and large-scale data transfers.
    - Less storage but limited detail.
  7. *Deep Packet Inspection*\
    - Analyzes packet content, protocols, and payloads.
    - Detects hidden channels, malware signatures, and encrypted tunnels.
    - Often used by IDS/IPS systems.
Tools used for network traffic capture and analysis are:
  1. *Wireshark*\
    - Industry-standard packet analysis tool.
    - Supports live capture and PCAP analysis.
    - Provides protocol decoding, filtering, and session reconstruction.
  2. *Tcpdump / Tshark*\
    - Command-line packet capture tools.
    - Useful for quick capture, automation, and remote forensic work.
    - Output stored as PCAP files for later analysis.
  3. *NetworkMiner*\
    - Forensic network analysis tool.
    - Extracts files, images, credentials, and metadata from PCAP captures.
    - Useful for reconstructing attacker activities.
  4. *Suricata / Snort*\
    - IDS/IPS systems used to detect malicious traffic.
    - Provide alerts, signatures, flow data, and logs useful in forensic analysis.
  5. *Zeek (formerly Bro)*\
    - Network security and monitoring framework.
    - Logs detailed behavior of network activities.
    - Excellent for timeline reconstruction and anomaly detection.
  6. *NetFlow and IPFIX Tools*\
    - Tools like SolarWinds NetFlow Traffic Analyzer, nfdump, and ElastiFlow parse flow records.
    - Help identify large transfers, scanning patterns, or lateral movement.
  7. *Fiddler / Burp Suite*\
    - Used to capture and analyze HTTP/HTTPS web traffic.
    - Useful in cases of web attacks, MITM analysis, and API inspection.
  8. *Hardware Taps and SPAN Tools*\
    - Devices like Garland TAPs, Dualcomm, NetScout are used for clean packet acquisition.
    - Provide tamper-proof capturing without affecting traffic flow.
  9. *Security Information and Event Management (SIEM) Systems*\
    - Tools like Splunk, ELK Stack, QRadar, ArcSight collect and correlate network logs.
    - Enable large-scale forensic analysis and event correlation.
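At its core, the event correlation a SIEM performs is a timestamp-ordered merge of per-source logs. A minimal sketch (source names and events are illustrative):

```python
# Merge already-sorted per-source logs (firewall, auth, server) into
# one timeline, ordered by timestamp. heapq.merge keeps the merge lazy
# and memory-friendly for large log sets.
from heapq import merge

firewall = [(1700000005, "fw", "ALLOW 10.0.0.7 -> 203.0.113.9:443")]
auth     = [(1700000001, "auth", "failed login for admin"),
            (1700000003, "auth", "successful login for admin")]
server   = [(1700000007, "srv", "large outbound transfer started")]

timeline = list(merge(firewall, auth, server))   # sorted by timestamp
for ts, source, event in timeline:
    print(ts, source, event)
```

Real SIEMs add normalization, clock-skew handling, and rule engines on top of this ordering step.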
= 19. Explain malware analysis and identify ways to trace data hiding techniques with network flows.
== 1. Malware Analysis
Malware analysis is the process of examining malicious software to understand its behaviour, purpose, and impact on a system. It helps forensic investigators identify how malware infects a device, what data it targets, and how it communicates with attackers. Malware analysis supports incident response, evidence collection, and threat detection.
  1. *Static Analysis*\
    The malware is examined without executing it. Investigators inspect file headers, strings, hashes, and API calls to identify suspicious functions or embedded resources. Static analysis is useful for quick identification of malware families and their capabilities.
  2. *Dynamic Analysis*\
    The malware is executed in a controlled sandbox environment to observe real-time behaviour such as file creation, process injection, registry changes, and network communication. This reveals how the malware interacts with the system.
  3. *Code/Reverse Engineering*\
    Advanced analysis uses tools like disassemblers and debuggers to study the internal code. Reverse engineering helps uncover hidden logic, encryption routines, anti-forensic methods, and persistence mechanisms.
== 2. Tracing Data Hiding Techniques 
Attackers commonly hide data to avoid detection and forensic analysis. They exploit hidden storage areas not included in conventional searches. Three major areas used for hiding data are:
  1. *Host Protected Area (HPA)*\
    - HPA is a hidden part of a hard disk that is not normally visible to the operating system.
    - Attackers can store confidential files or malware components in the HPA, making them invisible to standard forensic tools.
    - Traditional imaging tools may skip the HPA unless explicitly instructed.
    - If malware executes from HPA, investigators may detect suspicious outbound connections or C2 communication even though the file itself is hidden on disk.
  2. *Slack Space*\
    - Slack space is the unused portion of a cluster between the end of a file and the end of the allocated block.
    - Hackers use it to hide fragments of text, images, or small pieces of malware.
    - Since slack space isn't normally visible in the file listing, hidden data can remain undetected.
    - Recovered slack-space data may reveal configuration files, IP addresses, or URLs that point to malicious network flows.
  3. *Alternate Data Streams (ADS)*\
    - ADS is a feature of the NTFS file system that allows additional data to be stored in a file without changing its visible size.
    - Attackers exploit ADS to hide executables or scripts behind innocent-looking files, for example `notepad.txt:hidden.exe`.
    - Windows Explorer does not show ADS content, making it an effective hiding place.
    - Hidden malware stored in ADS may generate abnormal network flows (DNS queries, TCP connections, or data exfiltration) despite the main visible file appearing harmless.
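The size of the slack space described in point 2 follows directly from the cluster size, as a short worked example shows (the 4 KB cluster size is a common NTFS default, used here as an assumption):

```python
# Slack space size: a file occupies whole clusters, and the unused tail
# of the last cluster is slack that can conceal data.
def slack_bytes(file_size: int, cluster_size: int = 4096) -> int:
    remainder = file_size % cluster_size
    return 0 if remainder == 0 else cluster_size - remainder

# A 10,000-byte file on 4 KB clusters occupies three clusters
# (12,288 bytes), leaving 2,288 bytes of slack in the last cluster.
print(slack_bytes(10_000))   # 2288
print(slack_bytes(8_192))    # 0: file ends exactly on a cluster boundary
```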
== 3. How Network Flow Analysis Helps Detect Hidden Data
Even though the hidden content is stored in HPA, slack space, or ADS, malware must still communicate to perform actions like exfiltration or receiving commands. Network flows can reveal:
  - Unusual outbound traffic to unknown IP addresses
  - Repeated beaconing to command-and-control servers
  - DNS or HTTP requests from processes whose files appear legitimate
  - Sudden spikes in data upload despite no visible file activity
  - Traffic associated with hidden executables
By correlating network behavioural anomalies with disk-level findings, investigators can uncover hidden data that traditional scanning would miss.
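The beaconing pattern listed above can be flagged with a simple statistical heuristic: C2 implants often call home at near-constant intervals, so the inter-arrival gaps of connections to one destination show very low variance compared with human-driven traffic. A sketch (the jitter threshold is an illustrative rule of thumb):

```python
# Heuristic beaconing check on a flow's connection timestamps.
import statistics

def looks_like_beaconing(timestamps, max_jitter=0.1):
    """Flag a flow whose inter-arrival gaps deviate little from their mean."""
    if len(timestamps) < 4:
        return False                       # too few samples to judge
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    mean = statistics.mean(gaps)
    return mean > 0 and statistics.pstdev(gaps) / mean < max_jitter

# Connections every ~60 s (implant) vs. irregular browsing.
implant = [0, 60, 120.5, 180, 240.2, 300]
browsing = [0, 5, 90, 95, 400, 410]
print(looks_like_beaconing(implant))   # True
print(looks_like_beaconing(browsing))  # False
```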
= 20. Phases of malware analysis and methods used for detecting hidden or embedded data.
== 1. Phases of Malware Analysis
Malware analysis is carried out in a structured manner to understand how malicious software behaves, spreads, and impacts a system. The major phases are:
  1. *Preliminary Analysis*\
    In this initial phase, investigators examine the malware file without executing it:
      - Checking file hashes, size, headers, and metadata
      - Extracting readable strings
      - Identifying suspicious API calls
    This gives a quick overview of the malware type and its potential capabilities.
  2. *Static Analysis*\
    A deeper non-executing examination of the binary:
      - Disassembly to view code instructions
      - Understanding logical flow, functions, and algorithms
      - Identifying embedded resources like URLs, IPs, or encryption keys
    Static analysis is useful for signature creation and detecting obfuscation.
  3. *Dynamic Analysis*\
    The malware is executed inside a controlled sandbox environment:
      - Observing file system changes
      - Monitoring processes and registry modifications
      - Tracking network activity such as beaconing or C2 communication
    This reveals real behavior and helps detect persistence mechanisms.
  4. *Code/Reverse Engineering*\
    The most advanced phase, performed when deeper insight is needed:
      - Step-by-step debugging
      - Identifying encryption routines, anti-debug tricks, packing, and hidden logic
      - Recovering hidden strings and algorithms
    Reverse engineering is crucial for understanding complex malware like ransomware or rootkits.
== 2. Methods Used for Detecting Hidden or Embedded Data
Attackers often hide data inside areas not examined by normal tools. Forensics relies on specialized techniques to uncover such concealed information.
  1. *Host Protected Area Analysis*\
    - The HPA is a hidden region on a disk not normally accessible by the OS.
    - Attackers store hidden files or malware components here.
    - Tools must explicitly scan for HPA regions since traditional imaging may skip them.
  2. *Slack Space Analysis*\
    - Slack space is the unused portion of a disk cluster after the end of a file.
    - Hackers hide small fragments of data here.
    - Forensic tools scan slack space to extract such embedded content.
  3. *Alternate Data Streams in NTFS*\
    - NTFS allows files to carry extra data streams without affecting visible size.
    - Attackers hide executables or scripts using syntax like `file.txt:hidden.exe`.
    - ADS is invisible to standard file listings and must be detected using specialized NTFS analysis tools.
  4. *Steganography Detection*\
    - Embedded data may be hidden inside images, audio, or video.
    - Detection involves:
      - Checking file size anomalies
      - Statistical analysis of pixel or sample patterns
      - Comparing originals with suspected modified files
  5. *Signature and Entropy Analysis*\
    Hidden or embedded data often displays irregular structure:
      - High entropy indicates encrypted or packed content
      - Unusual file signatures help detect appended or injected data
  6. *File Carving and Metadata Inspection*\
    Forensic tools carve out embedded content by scanning raw disk space for known headers and footers:
      - Useful for recovering hidden graphic files, thumbnails, or EXIF metadata
      - Supports identification of tampering or hidden fragments
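The entropy measure from point 5 is Shannon entropy in bits per byte: values near the 8-bit maximum suggest compressed, packed, or encrypted content, while plain text sits much lower. A sketch (the 7.5 bits/byte threshold is an illustrative rule of thumb):

```python
# Shannon entropy for spotting packed or encrypted content.
import math
import os
from collections import Counter

def entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte (0.0 to 8.0)."""
    if not data:
        return 0.0
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

text = b"the quick brown fox jumps over the lazy dog" * 50
random_blob = os.urandom(2048)          # stand-in for ciphertext
print(round(entropy(text), 2))          # low: plain English text
print(entropy(random_blob) > 7.5)       # True: near-maximal entropy
```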
= 21. Explain the acquisition for mobile device and sim card data.
Acquisition is the process of collecting data from a mobile device and its SIM card in a forensically sound manner so that the original evidence remains intact and admissible in court. The goal is to extract all relevant information without altering the source.
== 1. Acquisition Procedure for Mobile Devices
1. *Securing and Isolating the Device*\
  - The device is first isolated from networks to prevent remote wiping or incoming data. Airplane mode, Faraday bags, or disabling Wi-Fi, mobile data, and Bluetooth are used. The device condition, model, IMEI, and screen state are documented.
2. *Maintaining Chain of Custody*\
  - All handlers, timestamps, and actions taken on the device are recorded. This ensures the evidence is legally admissible.
3. *Choosing the Appropriate Acquisition Method*\
  - Logical Acquisition: Extracts accessible data such as contacts, messages, call logs, app data.
  - File System Acquisition: Retrieves file system structure, databases, and application folders.
  - Physical Acquisition: Creates a bit-by-bit copy of the device memory, including deleted data.
  - Manual Acquisition: Screens are photographed when other methods cannot be used.
4. *Using Certified Forensic Tools*\
  - Tools like Cellebrite UFED, Magnet AXIOM, XRY, Oxygen Forensics extract data safely. These tools prevent modification of original data through built-in protection mechanisms.
5. *Bypassing Locks (If Legally Permitted)*\
  - Recovery mode, exploit-based unlocking, JTAG, chip-off techniques, or vendor-supported methods are used if the device is locked or encrypted. This step must follow legal procedures.
6. *Hashing and Verification*\
  - Once acquired, the extracted data is hashed using MD5/SHA-1/SHA-256 to verify that the copy matches the original. Identical hashes confirm integrity.
7. *Documentation*\
  - Every action, tool, acquisition mode, and result is thoroughly documented for the final forensic report.
== 2. Acquisition Procedure for SIM Cards
1. *SIM Documentation and Removal*\
  - The SIM is removed carefully and details such as ICCID, IMSI, serial number, and carrier information are recorded.
2. *SIM Card Imaging*\
  - A forensic SIM reader connects the card to a forensic workstation. A bit-stream copy is created to avoid altering the original SIM content.
3. *Extracting Stored Data*\
  - Common recoverable data includes:
    - Contacts stored on the SIM
    - SMS messages
    - IMSI and authentication keys
    - Last dialed/received numbers
    - Network information and location-related identifiers
4. *Handling PIN/PUK Protection*\
  - If the SIM is locked, investigators obtain PUK codes or use authorised unlocking procedures from the service provider following legal guidelines.
5. *Hashing and Integrity Checking*\
  - Hash values are generated for the SIM image to prove that no modification occurred during extraction.
6. *Secure Storage and Reporting*\
  - The original SIM is sealed, labeled, and stored securely. All steps (tools used, extracted data, timestamps) are entered in the forensic report.
= 22. Write a note on mobile device acquisition techniques for Android and iOS platforms with their limitations.
Mobile device acquisition is the process of extracting data from smartphones in a forensically sound manner. Because Android and iOS differ in architecture, security models, and file systems, the acquisition techniques and their limitations vary across both platforms.
== 1. Android Mobile Device Acquisition Techniques
1. *Logical Acquisition*\
  - Extracts user-level data such as messages, contacts, call logs, app data through standard APIs.
  - Limitations:
    - Cannot recover deleted data.
    - Restricted if device is locked or encrypted.
    - App sandboxing may prevent access to internal databases.
2. *File System Acquisition*\
  - Extracts the entire file system structure including app folders, SQLite databases, configuration files, and media.
  - Limitations:
    - Requires elevated permissions.
    - Rooting may alter timestamps or violate forensic integrity.
    - Not all partitions or system areas are accessible.
3. *Physical Acquisition*\
  - Creates a bit-by-bit copy of the entire device storage, including deleted files, unallocated space, and hidden data.
  - Limitations:
    - Increasingly difficult due to full-disk encryption on modern Android devices.
    - Chip-off and JTAG methods may be risky and can damage hardware.
    - Lock screens and secure boot mechanisms complicate access.
4. *ADB Based Extraction*\
  - Uses USB debugging to extract certain categories of data.
  - Limitations:
    - Requires USB debugging to be enabled.
    - Limited access; cannot retrieve system-level or deleted data.
    - Ineffective on newer Android versions due to tightened security.
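ADB-based extraction as described above can be sketched by driving the `adb` client from Python. This is a hedged illustration only: it assumes the `adb` binary is on PATH, USB debugging is already enabled and authorised on the handset, and the output filename is hypothetical. `adb backup` is also deprecated and heavily restricted on recent Android versions, which is exactly the limitation noted above.

```python
import shutil
import subprocess

def adb(*args: str) -> str:
    """Run one adb command and return its stdout as text."""
    result = subprocess.run(["adb", *args],
                            capture_output=True, text=True, check=True)
    return result.stdout

# Only attempt extraction when the adb client is actually installed.
if shutil.which("adb"):
    print(adb("devices"))                              # handsets authorised for debugging
    adb("backup", "-f", "evidence_backup.ab", "-all")  # app-data backup archive
```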
== 2. iOS Mobile Device Acquisition Techniques
1. *iTunes/Backup-Based Logical Acquisition*\
  - Extracts data from an unencrypted or password-known iTunes/iCloud backup.
  - Limitations:
    - Encrypted backups require the backup password.
    - Deleted data is not recovered.
    - Some sensitive app data may not be included in backups.
2. *File System Acquisition (Jailbroken Devices)*\
  - Full access to internal directories such as app containers, logs, and system files.
  - Limitations:
    - Jailbreak not available for all iOS versions/devices.
    - Jailbreaking alters system state, affecting forensic integrity.
    - Can void evidence authenticity if not carefully documented.
3. *Physical Acquisition (Limited to Older iPhones)*\
  - Complete dump of the flash storage, including deleted data.
  - Limitations:
    - Not supported on newer iPhones with strong Secure Enclave encryption.
    - Requires advanced expertise and risk of device damage.
    - Full-disk encryption prevents access to sensitive areas without passcode.
4. *AFU and BFU Extraction (After/Before First Unlock)*\
  - Advanced commercial tools extract data based on device unlock state.
  - Limitations:
    - BFU extraction yields very little data.
    - AFU extraction still requires the passcode in most cases.
    - Apple’s security architecture restricts file access even for forensic tools.
== 3. Common Limitations Across Both Platforms
- *Full Disk Encryption:* Makes physical acquisition extremely difficult without user authentication.
- *Secure Boot/Trusted Execution Environments:* Prevents low-level access to storage.
- *App Sandboxing:* Limits access to app-specific data.
- *Frequent OS Updates:* Break existing forensic methods.
- *Cloud Syncing:* Many artifacts are stored in the cloud, not on the device.
= 23. Write the challenges faced during mobile forensics.
Mobile forensics involves extracting and analysing data from smartphones and handheld devices, but investigators face several challenges due to rapid technological changes, security features, and device diversity.
1. *Device Diversity:*
There are thousands of mobile models with different hardware, operating systems, file systems, and chipsets. A single forensic tool cannot support all devices, making standardisation difficult.
2. *Strong Security and Encryption:*
Modern smartphones use advanced encryption. Without the passcode, it becomes extremely difficult to extract data.
3. *Locked or Damaged Devices:*
If a device is password-protected, damaged, or has biometric locks, accessing internal storage becomes challenging.
4. *Frequent OS Updates:*
Constant updates to Android and iOS introduce new security patches that may break existing forensic methods. Tools quickly become outdated.
5. *Cloud Storage and Syncing:*
Much user data is stored in the cloud, not on the device. Accessing cloud data requires additional legal permissions and credentials, complicating evidence extraction.
6. *Volatile Data:*
Mobile devices store volatile data in RAM which may disappear when the phone powers off or restarts. Capturing such data requires immediate action.
7. *Third-Party Apps:*
Apps like WhatsApp, Telegram, Signal, and social media platforms use encryption and proprietary storage formats. Extracting chats or metadata is difficult and often restricted.
8. *Remote Wipe and Auto-Delete Features:*
Phones may be configured to wipe data after failed login attempts or allow remote wiping from linked accounts. This can destroy evidence before acquisition.
9. *Data Volume and Complexity:*
Modern devices store large amounts of data (photos, videos, app data, location history), requiring significant time and processing power for analysis.
10. *Legal and Privacy Restrictions:*
Accessing a mobile device may involve sensitive personal information. Investigators must follow strict legal procedures for search, seizure, and privacy laws to avoid violating rights.
= 24. Explain the acquisition procedure for mobile device and sim card data.
Acquisition is the process of extracting data from a mobile device and its SIM card in a forensically sound manner so that the evidence remains authentic, complete, and legally acceptable.
1. Acquisition Procedure for Mobile Devices
  1. Securing and Documenting the Device:
      The device is first isolated to prevent remote access, network syncing, or remote wiping. Airplane mode, Faraday bags, or disabling connectivity are used. The device condition, model, serial number, and screen state are documented.
  2. Maintaining Chain of Custody:
      All handlers, timestamps, and actions performed on the device are recorded to ensure the evidence is admissible in court.
  3. Identifying Acquisition Method:
      Depending on the device type, lock status, and OS version, investigators choose from:
      - Logical acquisition - extracts accessible files, contacts, messages, logs.
      - File-system acquisition - obtains file system structure, databases, app data.
      - Physical acquisition - bit-by-bit copy of flash memory, including deleted data.
      - Manual acquisition - photographing the screen when other methods fail.
  4. Bypassing Locks or Permissions:
    If needed, tools or legal permissions are used to bypass screen locks, password protection, or encryption. Techniques include recovery mode access, JTAG, chip-off, or vendor-specific forensic tools.
  5. Using Certified Forensic Tools:
    Tools like Cellebrite UFED, XRY, Magnet AXIOM, or Oxygen Forensics extract data securely. These tools prevent modification of original data and create secure forensic images.
  6. Verification and Hashing:
    The extracted data is hashed to prove that the acquired copy is identical to the original content.
  7. Documentation:
    Every step, including methods used, tools, configurations, and extracted datasets, is documented for reporting and future reference.
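Steps 2, 6, and 7 above (chain of custody, hashing, and documentation) can be combined into a simple custody-log sketch. The field names and file contents here are hypothetical; real case-management systems record far more detail.

```python
import hashlib
import json
from datetime import datetime, timezone

def custody_entry(handler: str, action: str, image_path: str) -> dict:
    """Build one chain-of-custody record: who did what, when,
    and the SHA-256 of the evidence image at that moment."""
    with open(image_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return {
        "handler": handler,
        "action": action,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "sha256": digest,
    }

# Stand-in evidence image; a real one is produced by the forensic tool.
with open("device_image.dd", "wb") as f:
    f.write(b"\x00" * 512)

log = [custody_entry("Examiner A", "acquired image", "device_image.dd"),
       custody_entry("Examiner B", "opened for analysis", "device_image.dd")]

# Matching digests across entries show the image was not modified in between.
assert log[0]["sha256"] == log[1]["sha256"]
print(json.dumps(log, indent=2))
```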
2. Acquisition Procedure for SIM Card Data
  1. SIM Isolation and Documentation:
    The SIM card is removed carefully and its identifiers are recorded.
  2. SIM Card Imaging:
    A SIM card reader is used to connect the SIM to a forensic workstation. A bit-stream copy of the SIM is created to ensure non-destructive acquisition.
  3. Extracting Stored Data:
    Forensic tools extract SIM data such as:
      - Contacts stored in SIM memory
      - SMS messages
      - IMSI and authentication keys
      - Location information
      - Network information and service provider details
  4. Handling PIN/PUK Locks:
    If the SIM is locked, investigators may use authorised unlocking procedures or PUK codes obtained legally from the service provider.
  5. Hashing and Validation:
    Hash values are generated for the SIM image to ensure integrity and prove that no alteration has occurred.
  6. Proper Storage and Reporting:
    The original SIM is sealed, stored securely, and all acquisition steps are documented clearly in the forensic report.
= 25. Describe cyber law in India with respect to data privacy, investigation, and digital evidence.
Cyber law in India is primarily governed by the Information Technology Act, 2000 (IT Act) and its amendments. It provides the legal framework for regulating digital activities, protecting data, guiding cyber-crime investigations, and ensuring admissibility of electronic evidence.
1. *Data Privacy*\
  The IT Act along with the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 lays the foundation for data privacy in India. Overall, Indian cyber law aims to safeguard personal information and ensure responsible data handling.
    - Organisations collecting personal data must follow lawful, fair, and informed practices.
    - Sensitive data must be protected using strong security measures.
    - Section 43A holds companies liable for negligence if they fail to protect personal data, resulting in compensation to affected individuals.
    - The Digital Personal Data Protection Act (DPDP Act), 2023 further strengthens privacy rights by introducing consent-based data processing, data-principal rights, and obligations for data fiduciaries.

2. *Cyber-Crime Investigation*\
  The IT Act provides legal powers and procedures for investigating online offences. These provisions create a structured legal environment for investigating digital offences while ensuring accountability.
    - Section 66 deals with various computer-related offences such as hacking, identity theft, impersonation, and data tampering.
    - Section 69 empowers authorised agencies to intercept, monitor, or decrypt information in the interest of national security or crime prevention, but only with proper legal authorization.
    - Law enforcement agencies can search and seize digital devices under Section 80, allowing police officers to enter premises and arrest suspects without a warrant in certain cyber-crime situations.
    - Cyber Forensic Labs (CFLs), CERT-In, and specialised cyber-crime cells support technical analysis during investigations.

3. *Digital Evidence*\
  Cyber law in India also recognises and regulates electronic evidence. Proper chain of custody, forensic imaging, and secure handling are essential to ensure that digital evidence is not altered or tampered with during investigation.
    - Under the Indian Evidence Act, Section 65B, electronic records are admissible in court only if accompanied by a Section 65B certificate.
    - The certificate confirms the authenticity of the electronic record, the device used to produce it, and the integrity of the data.
    - The IT Act also validates electronic signatures and digital signatures, enabling legally binding digital communication and transactions.
= 26. Write a short note on provisions of the IT Act 2000 (amended 2008) which deal with cyber investigation and digital evidence admissibility.
The Information Technology Act, 2000, along with its significant 2008 amendment, provides the legal framework in India for conducting cyber-crime investigations and ensuring the admissibility of electronic evidence in courts. These provisions support lawful access, investigation procedures, and validation of digital records.
1. *Cyber Investigation Provisions*
  1. Section 66 Series - Computer-Related Offences
    - Sections 66, 66B, 66C, 66D, 66E deal with offences such as hacking, identity theft, cyber fraud, impersonation, and violation of privacy.
    - These sections give investigators legal grounds to initiate cases and prosecute offenders.
  2. Section 69 - Interception, Monitoring and Decryption
    - Authorises the government and law enforcement agencies to intercept or decrypt information for national security, investigation, and crime prevention.
    - Requires proper written authorisation, ensuring checks and balances.
  3. Section 69A - Blocking of Websites/Content
    - Allows the government to block online content for public order, investigation, or security purposes.
    - Helps investigators curb illegal content during ongoing cases.
  4. Section 69B - Monitoring of Traffic Data
    - Permits monitoring and collection of traffic data for cyber security and threat assessment.
    - Useful for tracing attackers, analysing network intrusions, and reconstructing cyber-attacks.
  5. Section 80 - Search and Seizure Powers
    - Allows police officers to enter premises, search, and arrest without warrant in certain cyber-crime scenarios.
    - Enables quick seizure of digital devices to prevent evidence destruction.
2. *Provisions for Digital Evidence Admissibility*
  1. Section 65B of the Indian Evidence Act (linked to IT Act)
    - Recognises electronic records as legally valid evidence.
    - Requires a Section 65B certificate to prove authenticity, the device used, and that the data has not been altered.
    - Ensures digital evidence is accepted in court with proper documentation.
  2. Section 3 & 4 - Legal Recognition of Digital Signatures and E-records
    - Grants legal validity to electronic documents and digital signatures, allowing them to be treated like physical records.
  3. Section 7 - Retention of Electronic Records
    - Allows electronic records to be stored and preserved as valid evidence for investigation and future reference.
  4. Section 79A - Establishment of Examiner of Electronic Evidence
    - Introduced in the 2008 amendment.
    - Recognises government-approved digital forensic labs as official experts whose reports and certifications hold high evidentiary value in court.
= 27. Write a note on forensic report writing format.
A standard forensic report is organised as follows:
```text
1 Executive Summary
1.1 Case Number
1.2 Name of authors, investigators and examiners
1.3 Purpose of investigation
1.4 Significant Findings
2 Investigation Objectives
3 Details of the investigation
3.1 Date and time of incident
3.2 Date and time of report
4 Investigation Process
4.1 Date and time the investigators were assigned
4.2 Allotted Investigators
4.3 Nature of claim
5 Evidence Information
5.1 Location of the evidence
5.2 List of collected evidence
5.3 Tools involved in collecting the evidence
5.4 Preservation of the evidence
6 Evaluation and analysis process
6.1 Initial evaluation of the evidence
6.2 Investigative techniques
6.3 Analysis of the computer evidence
7 Relevant Findings
8 Supporting Files
8.1 Attachments and appendices 
8.2 Full path of the important files 
9 Attacker’s methodology
10 Recommendations
```
= 28. Discuss how file time stamp metadata is used as evidence in legal proceedings with challenges. 
File timestamp metadata includes Created, Modified, and Accessed times. These timestamps are automatically generated by the operating system and record when a file was created, last edited, or last opened. In digital forensics and legal proceedings, timestamp metadata is a crucial source of evidence because it helps reconstruct user activity and establish timelines of events.
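These three timestamps can be read with Python's standard `os.stat` call. This is a minimal sketch with a stand-in file; in a real examination the path would point into a mounted forensic image, never the live evidence. Note the platform caveat: `st_ctime` is metadata-change time on Linux but creation time on Windows.

```python
import os
from datetime import datetime, timezone

def file_times(path: str) -> dict:
    """Return the timestamps the OS keeps for a file, as UTC ISO strings.
    Caveat: st_ctime is metadata-change time on Linux, creation time on Windows."""
    st = os.stat(path)
    def to_utc(t: float) -> str:
        return datetime.fromtimestamp(t, tz=timezone.utc).isoformat()
    return {"modified": to_utc(st.st_mtime),
            "accessed": to_utc(st.st_atime),
            "changed": to_utc(st.st_ctime)}

# Stand-in file for illustration only.
with open("evidence.txt", "w") as f:
    f.write("sample")

print(file_times("evidence.txt"))
```

Reporting the values in UTC, as here, avoids the timezone and daylight-saving pitfalls discussed below.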
- *Use of Timestamp Metadata as Evidence*
  1. Establishing a Timeline of Events:
    Timestamps allow investigators to determine when a file was created, modified, or accessed. This helps reconstruct the sequence of activities during a cybercrime, such as when malware was installed or when sensitive files were copied.
  2. Proving User Activity or Intent:
    Metadata can show if a suspect accessed or changed a file at a specific time, supporting allegations of data theft, document forgery, unauthorised access, or destruction of evidence.
  3. Corroborating Evidence:
    Timestamp information can be matched with logs, network records, CCTV, or email timestamps to strengthen the overall case and confirm the suspect’s presence or actions.
  4. Detecting Anti-Forensic Techniques:
    If timestamps appear inconsistent, investigators may identify attempts at tampering or the use of time-altering tools.
  5. Attributing Actions to Devices or Users:
    Different users or systems may leave different timestamp patterns. This helps link specific actions to particular accounts or machines involved in an incident.
- *Challenges in Using Timestamp Metadata*
  1. Easily Altered:
    Timestamps are not secure and can be changed intentionally using anti-forensic tools like "touch", timestomping malware, or system clock manipulation. This reduces their reliability as standalone evidence.
  2. System and Software Behaviour:
    Operating systems automatically update timestamps during normal use; for example, opening a file may update the “Accessed” timestamp. These automatic changes can mislead investigations.
  3. Timezone and Clock Issues:
    Incorrect system clocks, timezone differences, or daylight-saving adjustments can cause anomalies or conflicting timelines if not properly accounted for.
  4. Metadata Loss During Copying/Transfer:
    Copying files between devices or downloading from the internet may change timestamps, making it difficult to determine the original time values.
  5. Different File Systems Handle Timestamps Differently:
    FAT32, NTFS, ext4, APFS, and mobile OS file systems store timestamps differently. Some don’t store milliseconds or access times, leading to incomplete data.
  6. Volatility and Incomplete Records:
    Some timestamps may be disabled for performance reasons, leaving gaps in the forensic record. Cache cleaning or OS housekeeping may also overwrite metadata.
  7. Need for Corroboration:
    Timestamp metadata alone is rarely sufficient for conviction. Courts often require it to be supported by log files, system traces, or witness testimony.
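Challenge 1 above (easy alteration) can be demonstrated in a few lines: the standard `os.utime` call back-dates a file's access and modification times, which is essentially what anti-forensic timestomping tools do. The filename and date here are hypothetical.

```python
import os
from datetime import datetime, timezone

# Stand-in file; a suspect document would live on the seized file system.
with open("report.doc", "w") as f:
    f.write("draft")

# Back-date access and modification times to 1 Jan 2020 UTC, the same
# effect that anti-forensic "timestomping" tools achieve.
backdated = datetime(2020, 1, 1, tzinfo=timezone.utc).timestamp()
os.utime("report.doc", (backdated, backdated))

st = os.stat("report.doc")
assert abs(st.st_mtime - backdated) < 1  # recorded time no longer reflects reality
```

Because a single call suffices to rewrite these values, courts rightly expect timestamps to be corroborated by logs, network records, or other independent sources.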