parapup
Mar 31, 04:03 PM
Google/Android can't win in Gruber and his followers' minds. If they exercise control to reduce UI variations, it's not OPEN anymore. If they don't exercise control, there are complaints about carrier crapware. Either way, Gruber and co. exist to move the goalposts to suit their cult. iOS has favorable numbers - numbers FTW!! Oh wait, that's no longer true - numbers hardly matter!! Android has UI variations because of a lack of Google control - BAAAD stuff! Google is putting controls in place to promote more uniformity - GAAAWWD AWFUL BAIT and SWITCH!!
So nothing to see here, move along.
janstett
Oct 23, 11:44 AM
Unfortunately, not many multithreaded apps - yet. For a long time, most of the multithreaded apps were just a select few pro-level things: 3D/visualization software, CAD, database systems, etc. Those of us who had multiprocessor systems bought them because we had specific software in mind, or a group of applications, that could take advantage of multiple processors. As CPU manufacturing processes started hitting a wall right around the 3GHz mark, chip makers started to transition to multiple CPU cores to boost power - makes sense. Software developers have been lazy for years, just riding the wave of ever-increasing MHz. Now the multi-core CPUs are here and the software is behind, as many applications need serious rewrites in order to take advantage of multiple processors. Intel tried to get a jump on this with their Hyper-Threading implementation, which essentially simulated dual cores on a CPU by way of two virtual CPUs. Software developers didn't exactly jump on it and warm up to it. But I also don't think the software industry truly believed that CPUs would go multi-core on a mass scale so fast... Intel and AMD both said they would; I don't know why the software industry doubted them. Intel and AMD are uncommonly good about telling the truth about upcoming products. Both will be shipping quad-core CPU offerings by year's end.
What you're saying isn't entirely true and may give some people the wrong idea.
First, a multicore system is helpful when running multiple CPU-intensive single-threaded applications on a proper multitasking operating system. For example, right now I'm ripping CDs on iTunes. One processor gets used a lot and the other three are idle. I could be using this CPU power for another app.
The reality is that to take advantage of multiple cores, you had to take advantage of threads. Now, I was doing this in my programs with OS/2 back in 1992. I've been writing multithreaded apps my entire career. But writing a threaded application requires thought and work, so naturally many programmers are lazy and avoid threads. Plus it is harder to debug and synchronize a multithreaded application. Windows and Linux people have been doing this since the stone age, and Windows/Linux have had usable multiprocessor systems for more than a decade (it didn't start with Hyperthreading). I had a dual-processor 486 running NT 3.5 circa 1995. It's just been more of an optional "cool trick" to write threaded applications that the timid programmer avoids. Also it's worth noting that it's possible to go overboard with excessive threading and that leads to problems (context switching, thrashing, synchronization, etc).
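To make the synchronization point concrete, here's a minimal C++ sketch (my own illustration, not taken from any particular app): four threads bump a shared counter, and the std::mutex is the only thing keeping those increments from racing each other.

```cpp
#include <iostream>
#include <mutex>
#include <thread>
#include <vector>

int main() {
    long counter = 0;
    std::mutex m;  // guards counter against simultaneous updates

    auto work = [&] {
        for (int i = 0; i < 1'000'000; ++i) {
            std::lock_guard<std::mutex> lock(m);  // remove this and the increments race
            ++counter;
        }
    };

    std::vector<std::thread> threads;
    for (int i = 0; i < 4; ++i) threads.emplace_back(work);
    for (auto& t : threads) t.join();

    std::cout << counter << "\n";  // 4,000,000 only because the mutex serializes access
    return 0;
}
```

Drop the lock_guard and the final count comes out short on most runs - exactly the kind of intermittent bug that makes threaded code harder to debug than the single-threaded version, and a taste of the synchronization cost you pay when you go overboard with threads.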
Now, on the Mac side, OS 9 and below couldn't properly support SMP and it required a hacked version of the OS and a special version of the application. So the history of the Mac world has been, until recently with OSX, to avoid threading and multiprocessing unless specially called for and then at great pain to do so.
So it goes back to getting developers to write threaded applications. Now that we're getting to 4 and 8 core systems, it also presents a problem.
The classic reason to create a thread is to prevent the GUI from locking up while processing. Let's say I write a GUI program that has a calculation that takes 20 seconds. If I do it the lazy way, the GUI will lock up for 20 seconds because it can't process window messages during that time. If I move the calculation into a worker thread instead, the GUI thread stays free to process messages and keep the application alive, and the worker signals the GUI thread when it's done.
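Here's a rough C++ sketch of that pattern - no real GUI toolkit, just a stand-in event loop, and slow_calculation() is a made-up placeholder for the 20-second job. The worker thread grinds through the calculation while the "GUI" thread keeps servicing its loop, and an atomic flag serves as the completion signal.

```cpp
#include <atomic>
#include <chrono>
#include <iostream>
#include <thread>

// Stand-in for the 20-second calculation.
long long slow_calculation() {
    long long sum = 0;
    for (long long i = 0; i < 2'000'000'000LL; ++i) sum += i;
    return sum;
}

int main() {
    std::atomic<bool> done{false};
    long long result = 0;

    // Worker thread does the heavy lifting off the "GUI" thread.
    std::thread worker([&] {
        result = slow_calculation();
        done = true;  // signal the GUI thread that the work is finished
    });

    // Stand-in event loop: keeps "processing messages" while the worker runs.
    while (!done) {
        std::cout << "still responsive...\n";
        std::this_thread::sleep_for(std::chrono::milliseconds(500));
    }

    worker.join();
    std::cout << "calculation finished: " << result << "\n";
    return 0;
}
```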
But now with more than 4 or 8 cores, the problem is how do you break up the work? 9 women can't have a baby in a month. So if your process is still serialized, you still have to wait with 1 processor doing all the work and the others sitting idle. For example, if you encode a video, it is a very serialized process. I hear some work has been done to simultaneously encode macroblocks in parallel, but getting 8 processors to chew on a single video is an interesting problem.
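For work that does split up cleanly, the embarrassingly parallel version looks something like this hypothetical C++ sketch: carve the data into one slice per core, let each thread chew on its own slice, then combine the partial results at the end. Video encoding is harder precisely because its pieces depend on each other in ways a plain array sum doesn't.

```cpp
#include <algorithm>
#include <cstddef>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    std::vector<int> data(50'000'000, 1);  // stand-in workload: 50 million elements to sum
    unsigned n = std::max(1u, std::thread::hardware_concurrency());

    std::vector<long long> partial(n, 0);
    std::vector<std::thread> threads;

    // Each thread sums its own slice; the slices don't overlap, so no locking is needed.
    std::size_t chunk = data.size() / n;
    for (unsigned i = 0; i < n; ++i) {
        std::size_t begin = i * chunk;
        std::size_t end = (i == n - 1) ? data.size() : begin + chunk;
        threads.emplace_back([&, i, begin, end] {
            partial[i] = std::accumulate(data.begin() + begin, data.begin() + end, 0LL);
        });
    }
    for (auto& t : threads) t.join();

    long long total = std::accumulate(partial.begin(), partial.end(), 0LL);
    std::cout << "total = " << total << "\n";  // 50,000,000
    return 0;
}
```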
daneoni
Aug 27, 08:03 PM
Alright, I'm off. I hope everyone gets what they wish for on Tuesday, however wild. Cheers, and here's to PowerBook G5s tomorrow.
Full of Win
Apr 10, 01:16 PM
I'll bet money that Apple will make FCP into what Express should be.
I think many of us are sharpening our digital pitchforks in preparation for the announcement from Apple.
ThunderSkunk
Apr 10, 12:20 AM
Wow. You'd think an FCP users group would be able to track down a halfway decent graphic artist to make their banner graphic...
theBB
Aug 11, 07:28 PM
Confused.
Can somebody explain to me the differences between the cell phone markets in the US and Europe?
Will an 'iPhone' just be marketed to the US or worldwide (as the iPod is)?
Well, let's see: about 20 years ago, a lot of countries in Europe, Asia and elsewhere decided on a standard digital cell phone system and called it GSM. About 15 years ago GSM networks became quite widespread across those countries. In the meantime the US kept on using analog cell phones. Motorola did not even believe that digital cell phones had much of a future, so it decided to stay away from that market, a decision which almost bankrupted the company.
The US started rolling out digital service only about 10 years ago. As the US government does not like to dictate to private companies how to conduct their business, it sold the spectrum and put down some basic ground rules, but for the most part it let the service providers use whatever network technology they wished. For one reason or another, these providers decided to go with about four different standards at first. Quite a few companies went with GSM, AT&T picked a similar but incompatible TDMA standard (IS-136, I think), Nextel went with a proprietary standard called iDEN, and Sprint and Verizon went with CDMA, a radically different standard (IS-95) designed by Qualcomm. At the time the other big companies were very skeptical, so Qualcomm had to not only develop the underlying communication standards but also manufacture the cell phones and the electronics for the cell towers. However, once the system proved itself, everybody started moving in that direction. Even the upcoming 3G system for the GSM networks, called UMTS, uses a variant of CDMA technology.
CDMA is a more complicated standard than GSM, but it allows providers to cram more users into each cell, it is supposedly cheaper to maintain, and it is more flexible in some respects. However, anybody in that boat has to pay hefty royalties to Qualcomm, which dampens its popularity. While creating UMTS, the GSM standards bodies did everything they could to avoid using Qualcomm patents and thus avoid those payments. However, I don't know how successful those efforts were.
Even though Europeans here on these forums like to gloat that the US did not join the worldwide standard, that we did not play along, and that ours is a hodgepodge of incompatible systems, without the freedom to try out different standards CDMA would not have had the opportunity to prove its feasibility and performance. In the end, the rest of the world is also reaping the benefits through UMTS/WCDMA.
Of course, not using the same standards as everybody else has its price. The components for CDMA cell phones cost more and the system itself is more complicated, so CDMA versions of cell phones hit the market six months to a year after their GSM counterparts, if at all. The infrastructure cost of a rarer system is higher as well, which is why AT&T had to rip apart its network and replace it with a GSM version about five years after rolling it out. Sprint will probably convert Nextel's system in the near future as well.
I hope this answers your question.
NY Guitarist
Apr 5, 07:36 PM
Also, I'm waiting for the RED Scarlet camera to hit the market, and have heard speculation that RED and Apple will release a new highly efficient compression codec based on RED's Redcode called REDRay.
The speculation is that REDRay will be used for everything from 4K DCP playback in movie theaters to a download/streaming version that will be usable for buying up to 4K movies through iTunes.
RED hired plugin developer Graeme Nattress a while ago, and he has been pushing the REDcode science forward with excellent results.
jhedges3
Aug 11, 02:57 PM
See, now that is something I never understood: how the cell service can be so poor in a place like NYC, yet I was making calls on my CDMA phone in the middle of Wyoming this summer. In fact, there are few places in the very unpopulated Midwest and West where you can't get a decent signal, at least with a CDMA phone. People who come here with GSM are out of luck anywhere except metro areas.
New York has more of something than Wyoming: buildings. These buildings make it harder for the signal to reach people, I think. For whatever reason, CDMA seems to work much better here than GSM does.
freebooter
Nov 28, 09:39 PM
Just greed, plain and simple.
Porchland
Aug 7, 04:11 PM
Looks very nice. Spaces will become a "how did we live without this?" feature, as Exposé already has.
Does anyone know when we can expect a video of the WWDC to be uploaded??:confused:
I can't really tell how Spaces will work with Exposé.
Apple's Leopard Sneak Peek says:
Configure your Spaces by visiting the Dashboard and Exposé preference pane in System Preferences. Add rows and columns until you have all the desktop real estate you need. Arrange your Spaces as you see fit, then assign what function keys you want to control them. You can also lock specific applications to specific Spaces, so you’ll always know where, say, Safari or Keynote is at all times.
I could see the simultaneous use of both getting a little confusing.
My main concern overall about Leopard is that feature creep is going to cut into ease of use.
ciTiger
Apr 27, 08:58 AM
It seems a good argument to me.
But saying they are going to "issue" an update specifically for fixing related things seems fishy....
fivepoint
Apr 27, 01:54 PM
First off, before the ignorant attacks begin, no I'm not a birther. I'm personally of the opinion that he was born in America and generally share the president's feelings that this is a giant waste of time.
Now... to the issue at hand: I'm not an expert in layout/graphic design, but can someone please tell me why the PDF Certificate of Live Birth has Illustrator layers? If it's a scan, shouldn't it just be a single image - jpg, pdf, png, etc. - consisting of a bunch of pixels and not layers? I downloaded the file from whitehouse.gov, opened it in Adobe Illustrator, and after releasing the layers I can slide the black text around separately from the green/white background. I'm not sure what's going on here; can someone shed some light on the issue?
I'm assuming there's a logical explanation, any graphic artists here want to update the rest of us?
http://farm6.static.flickr.com/5026/5662168856_0e95c82cc7_b.jpg
http://farm6.static.flickr.com/5066/5661600471_9ebebdaf36_b.jpg
Shadow
Jul 14, 07:02 PM
Why be limited to 2? Why not 3, 4, 5 or 6? I also want quad 10GHz Woodcrests with 20GB of DDR6-8000 RAM and 2 exabytes of HDD space. AND room to upgrade. Oh, and quad 7900GTXs. For £1000.
OK, that's never gonna happen, but it illustrates the point that people want more and more for less and less.
0815
Apr 27, 08:17 AM
I actually thought looking at a history of where my phone has been on a map was kinda cool. Bummer.
Yes - I was hoping that when they 'fix' this they will leave an option in the settings to keep that data. I absolutely enjoyed browsing through the data and revisiting my trips that way (and sometimes wondering, 'what the heck did I do in that location?').
jaydub
Sep 18, 11:09 PM
Is it happening on a Tuesday, perchance? :D
bushido
Apr 12, 08:44 AM
Just got sold on the HTC Sensation. LOVE IT! You could wish the 5th would get half of it.
0racle
Mar 31, 04:31 PM
Oh, then I can take the Honeycomb source code and do whatever I want with it?
Oh, wait, I can't? Then how doesn't this make Android 'closed source'?
Sure - just buy a Honeycomb-powered device. Until then, Google has no legal requirement to give you the GPL portions of the source. As for the rest, it is licensed under the Apache License, which does not require that Google release the source at all, but does allow a user to modify and redistribute what they do have.
FOSS does not mean they have to put the source out in the open.
yg17
Mar 4, 07:51 AM
Invalid because it endorses something that could cause the collapse of society
Source?
:rolleyes:
bugfaceuk
Apr 10, 07:08 AM
anything less than the following will be a huge disappointment:
- touch-based editing released together with a huge "iPad"/editing board (probably connected to the main computer with Thunderbolt)
- professional features intact and developed
- integrates nicely with DI systems such as DaVinci
best,
jon m.
Faster horses.
ksz
Sep 20, 04:11 PM
The only real downside I see is that Intel Macs are unlikely to hold their value anywhere near as well as the PPC line did due to the quicker changes we'll see now.
I keep systems til they fall apart, pretty much, but there's quite a few on the various forums who say they always buy and sell 2-3 years later to upgrade.
I should have been more thorough in my previous reply. What I really like about these frequent updates are the following:
1. The motherboard has socketed processors (except for the laptops).
2. Even though Intel is updating processors every 6 months or so, the motherboard and chipset seem to support the next processor version.
Yonah can be replaced with Merom.
Woodcrest can be replaced with Clovertown.
Your computer does not become obsolete in 6 months. Instead, it gains new life if you decide that you need the new processor.
Every 12 to 18 months or so a new chipset may become necessary. Only then does your computer lose the upgrade potential. If you buy Merom, you may not be able to upgrade to the next processor. Likewise if you buy Clovertown. New chipsets will be required beyond Merom and Clovertown.
In any event, this is based on trailing history of just 1 year. Future events may unfold differently.
scottlinux
Sep 13, 11:41 AM
Blender (http://www.blender.org/) can use 8 cores.
Evangelion
Sep 19, 06:17 AM
Key word being DESKTOPS.
Again: NT was widely used on desktops. Maybe not by your Average Joe, but LOTS of people used it on the desktop. I used NT Workstation back when I was studying, my friend used NT on his PC, lots and lots of companies ran NT - the list goes on. Hell, there were probably an order of magnitude more NT desktops out there than there were Macs of any type!
I still don't personally know anyone who uses OS X. Does that mean that no one uses it?
MP machines were server based long before they were included in desktops. I'd like to see where people had dual Xeon based DESKTOPS 'cause I've never seen it.
There were plenty of people running SMP systems. I personally knew two guys who had SMP PCs. Just because you haven't seen anyone use one does not mean they weren't there.
Žalgiris
Mar 26, 02:25 AM
Been on Lion for the past month and I can't see myself going back to Snow Leopard.
Same here. Buggy as hell, but I like what I see.