Important Questions of Data Compression and Data Retrieval (DCDR) | GTU | 2161603

Important Questions of DCDR



Chapter 1:- Compression Techniques.

  • Define: self-information, entropy, lossy compression. (3 Mark)
  • State the models for lossless compression and explain any one in detail. (4 Mark)
  • What is Data Compression? Compare Lossy Compression with Lossless Compression. (3 Mark)
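
For quick revision, here is a minimal Python sketch of the two formulas behind these definitions, i(A) = -log2 P(A) and H = -Σ P(ai) log2 P(ai). The function names are mine, and the probability model is borrowed from the Chapter 2 tag question:

```python
import math

def self_information(p):
    """Self-information of an event with probability p: i(A) = -log2 P(A) bits."""
    return -math.log2(p)

def entropy(probs):
    """First-order entropy of a source: H = -sum(P(ai) * log2 P(ai)) bits/symbol."""
    return -sum(p * math.log2(p) for p in probs)

print(self_information(0.5))        # 1.0 bit: a 50% event carries exactly one bit
print(entropy([0.2, 0.3, 0.5]))     # ~1.485 bits/symbol
```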

Chapter 2:- Mathematical Preliminaries for Lossless Compression Models.

  • Write a short note on Uniquely decodable codes. (4 Mark)
  • Explain modeling and coding. Explain how this helps to reduce entropy, with a suitable example. (7 Mark)
  • Write a short note on Prefix Code. (3 Mark)
  • The probability model is given by P(a1) = 0.2, P(a2) = 0.3 and P(a3) = 0.5. Find the real-valued tag for the sequence a1 a1 a3 a2 a3 a1. (Assume cumulative probability function: F(0) = 0). (7 Mark)
  • Determine whether the following codes are uniquely decodable or not. 1. {0, 01, 11, 111} 2. {0, 10, 110, 111} (3 Mark)
  • Explain different types of models in Data Compression. (4 Mark)
  • Explain Digram Coding with suitable example. (4 Mark)
  • Explain Markov Model with example. (7 Mark)
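
A small Python helper for the prefix-code and unique-decodability questions above (my own illustrative code; note that prefix-freeness is sufficient but not necessary for unique decodability, so a full answer should also show the dangling-suffix test):

```python
def is_prefix_free(codewords):
    """True if no codeword is a prefix of another. A prefix (instantaneous)
    code is always uniquely decodable; sorting makes any prefix pair adjacent."""
    words = sorted(codewords)
    return all(not words[i + 1].startswith(words[i]) for i in range(len(words) - 1))

print(is_prefix_free(["0", "01", "11", "111"]))   # False: "0111" parses as 0|111 or 01|11
print(is_prefix_free(["0", "10", "110", "111"]))  # True: a prefix code
```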

Chapter 3:- Huffman Coding.

  • Explain Huffman Coding in detail with example. Define minimum variance Huffman codes. (7 Mark)
  • Encode “aacdeaab” using Adaptive Huffman code. Derive Output string, Codes and final tree.
  • Generate GOLOMB code for m=9 and n=8 to 13. (7 Mark)
  • Write procedure to generate TUNSTALL code. Generate TUNSTALL code with probability of P(A)=0.6, P(B)=0.3, P(C)=0.1 and n=3 bits. (7 Mark)
  • Explain Huffman Coding with respect to minimum variance Huffman codes with separate trees. (7 Mark)
  • Write different applications of Huffman Coding. (3 Mark)
  • Determine the minimum variance Huffman code with the given probabilities. (4 Mark)
    P(a1) = 0.2, P(a2) = 0.4, P(a3) = 0.2, P(a4) = 0.1 and P(a5) = 0.1.
  • Explain audio compression technique with suitable diagram. (7 Mark)
  • Explain Rice Codes in brief. (3 Mark)
  • Design a minimum variance Huffman code for a source that puts out letters from the alphabet A = {a1, a2, a3, a4, a5, a6} with P(a1) = P(a2) = 0.2, P(a3) = 0.25, P(a4) = 0.05, P(a5) = 0.15, P(a6) = 0.15. Find the entropy of the source, the average length of the code and the efficiency. Also comment on the difference between a Huffman code and a minimum variance Huffman code. (7 Mark)
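
A minimal Huffman construction in Python, as a sketch for the design question above (names are mine; which of the equally optimal codes you obtain, including the minimum-variance one, depends on how ties between equal probabilities are broken):

```python
import heapq

def huffman_code(probs):
    """Huffman coding: repeatedly merge the two least probable nodes,
    prepending 0/1 to the codewords inside each merged node."""
    # Heap entries: (probability, tie-breaker, {symbol: partial codeword})
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    tick = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, tick, merged))
        tick += 1
    return heap[0][2]

# Source from the design question above
probs = {"a1": 0.2, "a2": 0.2, "a3": 0.25, "a4": 0.05, "a5": 0.15, "a6": 0.15}
code = huffman_code(probs)
avg = sum(probs[s] * len(w) for s, w in code.items())
print(code)
print(avg)   # 2.55 bits/symbol; entropy is ~2.466, so efficiency is ~96.7%
```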



Chapter 4:- Arithmetic Coding.

  • Define Arithmetic Coding. Encode and Decode “BACBA” with arithmetic coding. (P(A)=0.5, P(B)=0.3, P(C)=0.2). (7 Mark)
  • Encode and Decode “AABBC” with arithmetic coding. (P(A)=0.6, P(B)=0.3, P(C)=0.1). (7 Mark)
  • Write pseudocode for integer arithmetic encoding and decoding algorithm. (7 Mark)
  • Compare Arithmetic Coding with Huffman Coding. (4 Mark)
  • Write the method to generate a tag in Arithmetic Coding. (4 Mark)
  • Explain Uniqueness and Efficiency of the Arithmetic Code. (3 Mark)
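
A sketch of tag generation for the encoding questions above (a floating-point version, illustrative only; the integer implementation asked for in the pseudocode question rescales the interval instead of shrinking it indefinitely):

```python
def arithmetic_tag(sequence, probs):
    """Tag generation: successively narrow [low, high) using the cumulative
    distribution; any number in the final interval identifies the sequence."""
    F_lo, F_hi, c = {}, {}, 0.0     # per-symbol slice of the CDF, with F(0) = 0
    for s, p in probs.items():
        F_lo[s], F_hi[s] = c, c + p
        c += p
    low, high = 0.0, 1.0
    for s in sequence:
        span = high - low
        low, high = low + span * F_lo[s], low + span * F_hi[s]
    return low, high, (low + high) / 2   # the midpoint is a convenient tag

# Model from the "BACBA" question above
lo, hi, tag = arithmetic_tag("bacba", {"a": 0.5, "b": 0.3, "c": 0.2})
print(lo, hi, tag)
```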


Chapter 5:- Dictionary Techniques.

  • Given an initial dictionary consisting of the letters a b r y ḇ, encode the following message using the LZW algorithm: aḇbarḇarrayḇbyḇbarrayarḇbay. (7 Mark)
  • Encode the following sequence using the LZ77 and LZ78 algorithms: (7 Mark)
    ḇarrayarḇbarḇbyḇbarrayarḇba
    Assume you have a window size of 30 with a look-ahead buffer of size 15.
    Furthermore assume that C(a)=1, C(b)=2, C(ḇ)=3, C(r)=4, and C(y)=5. *
  • Encode the following sequence using Digram Coding of the Static Dictionary method (generate a 3-bit code): abracadabra. (7 Mark)
  • Given an initial dictionary Index 1=w, 2=a, 3=b, encode the following message using the LZ78 algorithm: wabbaḇwabbaḇwabbaḇwabbaḇwooḇwooḇwoo. (7 Mark)
  • Explain LZ77 with suitable example. (7 Mark)
  • Encode the following sequence using the LZ77 method: cabracadabrarrarrad. Assume a window size of 13 and a look-ahead buffer of size 6. (7 Mark)
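
A minimal LZW encoder in Python as a reference point for the dictionary questions above (illustrative; '_' stands in for the space symbol ḇ, and indices start at 1 as in the questions):

```python
def lzw_encode(text, alphabet):
    """LZW: output the index of the longest dictionary match, then add
    match + next symbol to the dictionary and continue from that symbol."""
    dictionary = {ch: i for i, ch in enumerate(alphabet, start=1)}
    output, pattern = [], ""
    for ch in text:
        if pattern + ch in dictionary:
            pattern += ch                       # keep growing the match
        else:
            output.append(dictionary[pattern])  # emit the longest match
            dictionary[pattern + ch] = len(dictionary) + 1
            pattern = ch
    output.append(dictionary[pattern])
    return output, dictionary

# Initial dictionary a=1, b=2, r=3, y=4, _=5, on a short prefix of the question's sequence
codes, d = lzw_encode("a_bar_array", ["a", "b", "r", "y", "_"])
print(codes)
```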

Chapter 6:- Predictive Coding.

  • Encode the sequence etaḇcetaḇandḇbetaḇceta using the Burrows-Wheeler transform and move-to-front coding.* (7 Mark)
  • Write a short note on the old JPEG standard and JPEG-LS. (7 Mark)
  • Explain CALIC. (3 Mark)
  • Explain prediction with partial match in short. (3 Mark)
  • Explain Facsimile Encoding and Exclusion principle in detail. (4 Mark)
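
A compact sketch of the Burrows-Wheeler transform followed by move-to-front coding, matching the first question in this chapter (illustrative names; a practical implementation would avoid materialising every rotation):

```python
def bwt(s):
    """Burrows-Wheeler transform: sort all cyclic rotations and keep the
    last column, plus the row where the original string ended up."""
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(r[-1] for r in rotations), rotations.index(s)

def move_to_front(seq, alphabet):
    """Move-to-front: emit each symbol's current position, then move it to
    the front, turning the runs produced by BWT into runs of small numbers."""
    table = list(alphabet)
    out = []
    for ch in seq:
        i = table.index(ch)
        out.append(i)
        table.insert(0, table.pop(i))
    return out

last_col, row = bwt("banana")
print(last_col, row)                    # nnbaaa 3
print(move_to_front(last_col, "abn"))   # [2, 0, 2, 2, 0, 0]
```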

Chapter 7:- Mathematical Preliminaries for Lossy Coding.

  • Explain nonuniform quantization. (3 Mark)
  • Explain pdf optimized quantization. (3 Mark)
  • Explain structured vector quantizers. (3 Mark)
  • Explain pyramid vector quantization. (3 Mark)
  • Explain adaptive quantization with any one approach. (4 Mark)
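
A small sketch of one classic nonuniform quantizer, mu-law companding (a compressor, a uniform quantizer, then an expander); parameter values here are illustrative, not from any question:

```python
import math

def mu_law_quantize(x, levels=64, mu=255.0):
    """Companded nonuniform quantization of x in [-1, 1]: compress with the
    mu-law curve, apply a uniform midrise quantizer, then expand back.
    Small amplitudes effectively get finer step sizes than large ones."""
    # Compressor: c(x) = sign(x) * ln(1 + mu*|x|) / ln(1 + mu)
    c = math.copysign(math.log1p(mu * abs(x)) / math.log1p(mu), x)
    step = 2.0 / levels                          # uniform quantizer in the compressed domain
    q = (math.floor(c / step) + 0.5) * step      # midrise reconstruction level
    # Expander: invert the compressor to return to the signal domain
    return math.copysign(math.expm1(abs(q) * math.log1p(mu)) / mu, q)

for x in (0.02, 0.1, 0.9):
    print(x, round(mu_law_quantize(x), 4))   # relative error is similar for small and large inputs
```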



Chapter 8:- Vector Quantization.

  • Explain Scalar Quantization in detail. (7 Mark)
  • Explain Vector Quantization in detail. (7 Mark)
  • Explain Linde-Buzo-Gray algorithm in detail. (4 Mark)
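
A bare-bones Linde-Buzo-Gray (generalized Lloyd) iteration in Python (illustrative: random initialisation and a fixed iteration count stand in for a proper distortion-based stopping rule):

```python
import random

def lbg(points, k, iters=20):
    """Linde-Buzo-Gray: alternate (1) assigning each training vector to its
    nearest codevector and (2) moving each codevector to its cell's centroid."""
    codebook = random.sample(points, k)          # naive initialisation
    for _ in range(iters):
        cells = [[] for _ in range(k)]
        for p in points:                         # assignment step
            j = min(range(k),
                    key=lambda j: sum((a - b) ** 2 for a, b in zip(p, codebook[j])))
            cells[j].append(p)
        for j, cell in enumerate(cells):         # centroid update step
            if cell:
                codebook[j] = tuple(sum(c) / len(cell) for c in zip(*cell))
    return codebook

random.seed(1)
training = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(200)]
print(lbg(training, k=4))                        # a 4-level 2-D codebook
```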

Chapter 9:- Boolean retrieval.

  • Explain and compare Incidence matrix and Inverted index with example. (7 Mark)
  • Explain skip pointers and Phrase queries with example. (7 Mark)
  • Explain Tokenization. (3 Mark)
  • Explain Information Retrieval in detail. (4 Mark)
  • Write a short note on Phrase queries with example. (4 Mark)
  • Explain stemming and lemmatization with suitable example. (7 Mark)
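
A toy inverted index with the standard sorted-postings intersection, as a sketch for the Boolean retrieval questions above (documents and names are illustrative):

```python
def build_inverted_index(docs):
    """Inverted index: map each term to a sorted postings list of docIDs."""
    index = {}
    for doc_id, text in enumerate(docs):
        for term in set(text.lower().split()):
            index.setdefault(term, []).append(doc_id)
    return index

def intersect(p1, p2):
    """Merge two sorted postings lists: the core of a Boolean AND query."""
    i = j = 0
    hits = []
    while i < len(p1) and j < len(p2):
        if p1[i] == p2[j]:
            hits.append(p1[i]); i += 1; j += 1
        elif p1[i] < p2[j]:
            i += 1
        else:
            j += 1
    return hits

docs = ["Brutus killed Caesar", "Caesar was ambitious", "Brutus was honourable"]
idx = build_inverted_index(docs)
print(intersect(idx["brutus"], idx["caesar"]))   # [0]  -> query: brutus AND caesar
```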

Chapter 10:- XML retrieval.

  • Explain challenges in XML information retrieval. (7 Mark)
  • Explain Vector Space Model in XML. (4 Mark)
  • Write a short note on: (I) Positional Index (II) Data-centric XML retrieval. (7 Mark)
