compression,
Moderator: SG Admins
OK, I'm having trouble with these zip folders. When I download them, or when I make them myself, they always end up saying 'the folder is invalid or corrupt'.
Any ideas? I'm not reinstalling :P
I'm running Windows Update as we speak, 11 items to be installed, but it looks like most of them have to be done separately :<
Go back to WinZip. It is a pain in the rectum to run through 3 screens typing crap just to extract a file....
"And regrettably your planet is one of those scheduled for demolition"
Rgds
Mike
Dead-Fish, Deep Sea Daddies...
My DVDs
Are you sure? The compression algorithm in PKZIP hasn't changed since about 1991. WinZip is just a front end with bells and whistles...
"And regrettably your planet is one of those scheduled for demolition"
Rgds
Mike
Dead-Fish, Deep Sea Daddies...
My DVDs
- Kulgan
- LAN Admin-Monkey
- Posts: 301
- Joined: Mon Oct 28, 2002 12:02 pm
- Location: Winchester, UK
Seahorse wrote: Are you sure? The compression algorithm in PKZIP hasn't changed since about 1991. WinZip is just a front end with bells and whistles...
Shrek is in fact correct. I think. I'm absolutely sure that's the case with WinRAR, and I seem to remember something about new ZIPpy type stuff too...
K
Yes for WinRAR; the compression algorithm changed from 2.xx to 3.xx, for instance. This is untrue for zip files (both PK & Win varieties). Here is a little experiment:
My registered copy of PKZIP from 1993 (the last time it was updated) unpacked a test archive I just made using WinZip 8.1 SR1.
E:\DATA\DOSFLO~1\DOS>"E:\DATA\DOS Floppy\DOS\PKUNZIP.EXE" dos.zip
PKUNZIP (R) FAST! Extract Utility Version 2.04g 02-01-93
Copr. 1989-1993 PKWARE Inc. All Rights Reserved. Shareware Version
PKUNZIP Reg. U.S. Pat. and Tm. Off.
■ 80486 CPU detected.
■ XMS version 2.00 detected.
■ DPMI version 0.90 detected.
Searching ZIP: DOS.ZIP
PKUNZIP: (W18) Warning! C.BAT already exists. Overwrite (y/n/a/r)?y
Extracting: C.BAT
PKUNZIP: (W18) Warning! D.BAT already exists. Overwrite (y/n/a/r)?a
Extracting: D.BAT
Inflating: DS.BAT
Extracting: EA.BAT
Extracting: EC.BAT
As you can see, a nine-year-old DOS version quite happily extracts a file made using a copy of WinZip released in Oct 02. It is a shame they have not bothered to improve the compression with the advances made in recent years, but it does wonders for compatibility...
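The reason this keeps working is that the deflate method both tools speak is frozen into the zip format itself. A minimal sketch of this in Python (purely illustrative, not part of the original test): a freshly made archive still records compression method 8, the same deflate the PKZIP 2.04g shown above already understood in 1993.

import zipfile

# Build a small archive using the standard deflate method...
with zipfile.ZipFile("dos.zip", "w", compression=zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("C.BAT", "@echo off\r\n")

# ...then confirm each entry is recorded as method 8 (deflate),
# unchanged since PKZIP 2.04 shipped it in 1993.
with zipfile.ZipFile("dos.zip") as zf:
    for info in zf.infolist():
        print(info.filename, "method", info.compress_type)  # C.BAT method 8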
"And regrettably your planet is one of those scheduled for demolition"
Rgds
Mike
Dead-Fish, Deep Sea Daddies...
My DVDs
Considering the compression ratios available using JPG & MPEG, not to mention DivX, I'm surprised that zip files are still in use today.
I am unsure why some rocket scientist out there has not come up with, say, a "DivxPacker" programme, using the DivX algorithm to power it.
/me goes surfing with Google for research purposes...
"And regrettably your planet is one of those scheduled for demolition"
Rgds
Mike
Dead-Fish, Deep Sea Daddies...
My DVDs
You'll probably find that DivX compression would be crap if applied to any type of data other than video....
mid_gen - www.the-midfield.com
Surely the codec doesn't know it's video, just 1s and 0s? Just the same as the text file or whatever it is you are compressing? At the end of the day the compressor is looking for replicated patterns to write a dictionary and substitute...
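To make the dictionary idea concrete, here is a toy sketch (Python, purely illustrative; real zip/LZ77 coders are far more refined): scan a sliding window for repeated runs and replace them with (distance, length) back-references, keeping anything unmatched as literals.

def toy_compress(data, window=255):
    # Naive LZ77-style matcher: for each position, find the longest
    # earlier run that repeats here and emit a back-reference to it.
    out, i = [], 0
    while i < len(data):
        best_len, best_dist = 0, 0
        for start in range(max(0, i - window), i):
            length = 0
            while (i + length < len(data) and length < 255
                   and data[start + length] == data[i + length]):
                length += 1
            if length > best_len:
                best_len, best_dist = length, i - start
        if best_len >= 3:                     # only worth it for longer runs
            out.append(("ref", best_dist, best_len))
            i += best_len
        else:
            out.append(("lit", chr(data[i])))
            i += 1
    return out

print(toy_compress(b"blah blah blah!"))
# [('lit', 'b'), ('lit', 'l'), ('lit', 'a'), ('lit', 'h'), ('lit', ' '),
#  ('ref', 5, 9), ('lit', '!')]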
"And regrettably your planet is one of those scheduled for demolition"
Rgds
Mike
Dead-Fish, Deep Sea Daddies...
My DVDs
But there's a hell of a lot more replicated information in a video stream than in, say, a database. Think about it: just off the top of my head I'd say that roughly 50% of the data in every frame is identical to the last frame in most video, and that makes it a helluva lot easier to compress the sucka. By comparison, a database or summat is pretty much totally random. Pete and Nick have both done video compression stuff, so no doubt they can contribute here.
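Rough numbers, but the intuition is easy to check with a sketch (Python/numpy, hypothetical frames rather than real video): two consecutive frames that differ only where something moved are almost entirely identical, while two buffers of random data agree almost nowhere.

import numpy as np

rng = np.random.default_rng(0)
frame1 = rng.integers(0, 256, size=(480, 640), dtype=np.uint8)
frame2 = frame1.copy()
frame2[200:280, 300:340] ^= 0xFF    # a small region changes between frames

random_buf = rng.integers(0, 256, size=(480, 640), dtype=np.uint8)

# Fraction of pixels unchanged: ~0.99 for the "video", ~1/256 for random data.
print("video-like:", (frame1 == frame2).mean())
print("random:    ", (frame1 == random_buf).mean())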
mid_gen - www.the-midfield.com
MPEG itself is already a very lossy algorithm that uses some real cute tricks to throw away data that the human eye and brain won't notice is missing. If you watch a live digital data stream, as seen via Sky or DTT (as was OnDigital, now BBC et al), wait for some video that involves panning or sideways movement. Now, instead of looking directly into the centre of the screen, look at the edges. You will notice that the centre is sharp but the outside / background / not-of-direct-interest bits around the edge will be severely blurred. This is not only due to the camera movement blur you get with analogue systems.
This is because the human brain looks at what is actually the centre of all the attention, the bit that is not moving, the object being panned, and that is where the data stream concentrates all its efforts, ensuring that part of the screen remains at a higher detail setting.
The part on the outside edges that will be blurred anyway, due to camera panning, will be assigned a VERY low data rate.
Unfortunately this is very difficult to demonstrate with live video, but if you have seen this, as I have, with a constantly rotating video clip of an angelfish swimming in circles in front of some coral, it is VERY noticeable. The fish remains in sharp focus and high detail, but the coral SUDDENLY blurs the second the fish starts to move, even when the camera is NOT panning.
MPEG also uses a system of assigning known patterns of video blocks, a bit like the idea behind fractals, and searches out these predefined patterns in the original video data stream; any patterns that are not found are assigned to the nearest pattern that fits. The data stream then transmits the pattern numbers rather than the data itself.
On top of this, when an object moves across the screen, the MPEG codec simply tells the decoder that the object defined by a certain boundary has just moved over by x number of pixels, rather than redrawing it all over again.
Also, the codec keeps a memory of x number of frames before and after the current frame, so that if the frame, say, 10 frames earlier is the same as the current one, it simply tells the decoder to pull the frame from 10 frames ago out of its memory rather than resending it all over again.
That is why, when the data stream fails, you can sometimes get a repeating sequence of two frames causing some very odd effects, such as a hand violently waving or eyelids opening/closing very fast!
So... no, I don't think that video compression algorithms will work very well with "normal" binary data!
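A toy version of the "only send what changed" trick described above (Python/numpy, hypothetical and nothing like real MPEG bitstream syntax): split each frame into 8x8 blocks and transmit only the blocks that differ from the previous frame.

import numpy as np

BLOCK = 8

def changed_blocks(prev, cur):
    # Yield (row, col, pixels) only for blocks that differ from the last
    # frame; unchanged blocks cost nothing, the decoder already has them.
    h, w = cur.shape
    for r in range(0, h, BLOCK):
        for c in range(0, w, BLOCK):
            if not np.array_equal(prev[r:r + BLOCK, c:c + BLOCK],
                                  cur[r:r + BLOCK, c:c + BLOCK]):
                yield r, c, cur[r:r + BLOCK, c:c + BLOCK]

prev = np.zeros((64, 64), dtype=np.uint8)
cur = prev.copy()
cur[8:16, 16:24] = 200                # one block's worth of "movement"

sent = list(changed_blocks(prev, cur))
print(f"sent {len(sent)} of {(64 // BLOCK) ** 2} blocks")  # sent 1 of 64 blocks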
Politics: 'Poli' in Latin means 'many' and 'tics' means 'bloodsucking creatures'.
-
- LAN Admin-Monkey
- Posts: 259
- Joined: Sun Oct 27, 2002 5:03 pm
- Location: Basingstoke
I was gonna get back to this one once I'd dug out the notes from last year's multimedia systems course. However, seeing as he got there first, I can abbreviate most of this to "What sparks said."
A lot of data compression algorithms make use of the fact that the structure of the data is known. I'm sure some of you know that MP3 is based on an acoustic model of human hearing, and can achieve the level of compression that it does by throwing away unneeded data. If you were to attempt to run raw image data through an MP3 encoding engine, and then back again, the result would be an unintelligible mess, as the MP3 encoder would throw away all kinds of data. A similar point applies to lossy JPEG compression, as it makes use of the colour perception of the human eye to throw away unneeded image data.
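The "throwing away" is literal and irreversible. A one-liner sketch of the idea (Python; simple quantisation standing in for the real psychoacoustic model): once samples are rounded to coarser levels, no decoder can recover the originals.

samples = [3, 141, 59, 26, 53, 58, 97]
quantised = [s // 16 * 16 for s in samples]   # keep only every 16th level
print(quantised)  # [0, 128, 48, 16, 48, 48, 96] -- the fine detail is gone for good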
One of the major points of MPEG is that it's designed to work on moving video streams, a point made by the learned Mr Sparks there. If you were to run a binary data file through Media Player, for instance (ignoring format errors), you'd probably get something that looks like constantly changing static. Each frame would bear little or no coherence to the last, or the next. This coherence is most of what MPEG relies on to achieve its compression, and it's entirely possible that in this case the data stream would come out larger than the original.
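That expansion effect is easy to verify with any dictionary coder (Python's zlib here, standing in for the general principle rather than for MPEG itself): patterned data shrinks, while incoherent, static-like data comes out slightly bigger than it went in.

import os
import zlib

text = b"the quick brown fox jumps over the lazy dog. " * 200
noise = os.urandom(len(text))          # no patterns for the coder to find

print(len(text), "->", len(zlib.compress(text)))    # shrinks dramatically
print(len(noise), "->", len(zlib.compress(noise)))  # slightly LARGER than input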
http://rnvs.informatik.tu-chemnitz.de/~ ... _tech.html seems to be a good bash at explaining exactly what's going on, if anyone's interested in going deeper, though I didn't read it all... If anyone's more interested than that, gimme a shout and I'll dig out me old lecture notes
Pete
"If at first you don't succeed, call it Version 1"
"If at first you don't succeed, call it Version 1"
LZX has not been updated for a few years, but compressed files used to be considerably smaller than Zip files. I used to use it in the days of my Amiga. I notice they have a Windows version at their site, but only for UNpacking.
It would be interesting to see some comparisons made using lossless packers to see which is the most efficient. Fortunately this is possible over at ACT. The best, in summary, were:
SBC 0.968, RAR (Win32) 3.00b5, ASPACK 2.00.1, UPX 0.99.2w, MONKEY'S AUDIO 3.96 & ERI32 5.1fre. SBC came top in 4 out of 9 tests. Can't say I have ever heard of it though. Still, if their comparison tables are even remotely accurate...
"And regrettably your planet is one of those scheduled for demolition"
Rgds
Mike
Dead-Fish, Deep Sea Daddies...
My DVDs