
Archive for February, 2009

MIT Museum crowd-sources exhibition

MIT will be 150 years old in two years. So, the MIT Museum (where you can see Judith Donath’s arresting and provocative info-overwhelm installation, which opened last night) is asking the public to nominate objects to put on display. The nominations themselves will remain online forever after as a very different sort of permanent display.


[berkman] Peter Suber on the future of open access

Peter Suber, Research Prof. of Philosophy at Earlham College, a visiting fellow at Yale Law’s Information Society Project, and blogger of Open Access News, is giving a full-house lecture at Harvard, sponsored by the Berkman Center. [Note: I’m live blogging, making mistakes, leaving things out, paraphrasing ineptly, etc. POSTED WITHOUT PROOFING or even a basic re-reading. Speed over accuracy. Welcome to the Web :(]

Peter says he’s going to assume that we know what open access is, etc. But he does want to define Green Open Access (= open access through a repository) and Gold OA (= OA through a journal). There’s also Gratis OA (free of charge, but possibly with licensing restrictions) and Libre OA (free of charge and free of licensing restrictions).

Peter says he doesn’t know the future of OA. He likes Alan Kay’s comment that the future is easier to make than to predict. He’s going to talk about 12 cross-over points in OA, in rough order of when they might occur:

1. For-pay journals allow green OA. About 63% of these journals already do this.

2. OA books: When there are more gratis OA books online than in the average university library. We crossed this a couple of years ago. “The permission problem is harder than digitization.” The next cross-over point here is getting more libre OA books online, which we are quite a distance from.

3. Funder policies: “When most publicly-funded research is subject to OA mandates.” This seems to be spreading, Peter says. Today, 32 public funders and more than 3 private funders have OA mandates.

4. Green OA deposits: “When most new peer-reviewed manuscripts are self-archived when accepted for publication.” In particle physics, this happens routinely. If 20% of researchers publish 80% of the articles, we could reach cross-over fairly quickly in some fields.

5. Author understanding: “When most publishing researchers have an accurate understanding of OA.” This is happening, but not very quickly.

6. University repositories: “When most universities have institutional repositories,” individually or as part of a consortium. This is happening slowly. In the absence of a universal repository, every university ought to have one. Universities will get to this point more slowly than funders, simply because they move more slowly. And we ought to ask why. Aren’t universities’ interests in line with OA?

7. Libre gold OA: “When most OA journals are libre OA.” Most OA journals are still merely gratis, but curb copying to drive traffic to their sites. This cross-over could happen overnight if the journals understood the issues. They’d lose a little traffic, but nothing else. There are grounds for optimism: the Open Access Scholarly Publishers Association is an association of OA journal publishers, and it requires libre gold OA. The SPARC Europe program sets standards for what a good OA journal is, and it recommends Creative Commons attribution licenses. These two organizations are helpful because there’s no top-down organization defining OA, so we rely on bottom-up organizations like these two to set the standards.

8. Journal backfiles: “When most TA [toll-access] journals have OA backfiles.” This is expensive to do. Google will do it, but Google’s terms are difficult: they don’t give the journal a copy of the digital files. (Libraries do get copies of the files of the books they let Google scan.) The OCA focuses on public-domain literature. “Once digitized, the benefits of increased visibility and citations should outweigh the trickle of revenue.” Journals make most of their money from new issues, so having a greater presence should help. In physics, almost 100% of articles are available OA, but the publishers can’t see any dip in subscriptions.

9. Author addenda: “When most new research is covered by author addenda” (i.e., additions that grant OA permission, tacked onto standard publisher-author contracts). Now there are few adopters. It’d be good to standardize these. The cross-over will come when universities or funders require it. If enough journals allow green OA, that would make addenda unnecessary.

10. University policies: “When most university research is subject to university-level OA mandates.” Today, 27 universities and 4 depts have these mandates. It’d help to have the largest/most productive universities move on this first.

11. OA journals: “When most peer-reviewed journals are OA.” “I don’t expect this for a long time.” Now 15% are OA. Progress is slow, but there is progress. High prestige journals are likely to hold out for a long time.

12. Libre green OA: “When most green OA is libre OA.” Today, only a small fraction is libre OA because most OA repositories depend on permission from publishers. The UKPMC Funders Group demands libre green OA. We will reach the cross-over “when it’s safe.” Harvard has taken the lead on this, Peter says, and it will spread as another large university takes this step, then another one … “It becomes self-fulfilling leadership.”

Q: Are we stuck with the Sonny Bono copyright extension act?
A: Yes. All copyright reform in the past few decades has been in the wrong direction. And it’s very hard to roll back copyright terms. The only silver lining is that when we have a consenting partner, we can bypass copyright via contract. The problem is that such contracts are not the default.

Q: Under libre OA, how are our scholarly attributions protected?
A: It’s a range. At one end is the public domain, which does not preserve attribution. But all the Creative Commons licenses preserve attribution. Most scholars don’t want public domain; they want CC attribution. “The Creative Commons attribution license provides everything a scholar could want.”

Q: [me] Conyers!
A: The Conyers bill would tell agencies not to require OA for works they fund. [I’ve put this badly.] The OA advocates are fighting it, but the agencies affected are not yet. I think Conyers is serious about it. I think he introduced it early because we don’t yet have a Secretary of Health and Human Services or a director of NIH. Conyers may be fighting a turf battle. [Paraphrasing!] “He’s motivated primarily to protect the jurisdiction of his committee.” Peter thinks it won’t pass, but it might be folded into another bill. “We’d like to spread the NIH policy to the rest of the government.”

Q: Is the economic downturn accelerating the adoption of OA?
A: The NIH just got $10B in the stimulus, which means there will be more and more OA articles. NSF also, but not as much because NSF requires OA for reports generated by those they fund [may not have gotten this right]. But, Peter thinks the downturn strengthens the case for OA. Libraries are going to be canceling subscriptions. And the Stimulus’ emphasis on green research will be more valuable if it’s OA. Open access to research amplifies its value.

Q: [jpalfrey] You’ve noticed there’s no OSF equivalent for OA. But I’d argue that the people in this room — librarians — are your OA OSF. What do you say to these librarians to advance our common cause?
A: Librarians are among the most important allies in the OA movement. But put all the allies together and you still don’t have an OSF. Libraries should be sending letters against the Conyers bill. When you negotiate subscriptions, you should negotiate the right to put articles from your authors into an OA repository. Libraries are the only buyers of peer-reviewed journals. When you’re the only buyer, you can dictate your terms, subject to anti-trust. Obama says that we have the right to demand transformation from the banks we’re saving. Librarians can do the same thing for journals. Journals are not serving all of our interests and are acting against others. Use your bargaining power. Get the right of self-archiving and, when the time is right, the right of libre self-archiving. Network with one another when you launch repositories. And, by the way, every school with an enlightened OA policy had librarians at the head of the charge.

Q: Can you give an example of an archive that works?
A: Universities that have mandatory language still have to supplement the language with incentives and education. As you go from unmandated to mandated, deposits go from 15% toward 100%. (15% is the average for voluntary, spontaneous archiving.) It works best in the Dutch universities that let the “cream” of the articles rise to the top. Every week they feature good work in public. This gets academics to archive their work without a mandate.

Q: Might universities work with publishers collaboratively to create new business models?
A: Publishers differ in their attitudes toward OA. Some are experimenting in good faith. Some, in bad faith. Some who do OA are actively lobbying for the Conyers bill. Librarians understand the scholarly landscape better than publishers do and could educate them. Society publishers [i.e., societies that publish] could be told that they’re threatened not by OA but by the “big deal” bundling of academic journals.

Q: Is Springer’s taking over of BioMed Central a good thing?
A: Yes. BMC is for-profit. BMC was the world’s largest OA publisher. Now Springer is. Springer says that OA is a sustainable part of their business. My reading is that Springer is preparing for an OA future.

James Boyle on keeping public science open to the public

The Financial Times yesterday ran a terrific op-ed by James Boyle, explaining the ridiculousness of the Conyers bill “that would eviscerate public access to taxpayer funded research.” The op-ed is written in Jamie’s light-hearted-yet-penetrating style, and should be read simply for that reason — preeminent legal scholars of copyright are not supposed to be entertaining.

Then, at the end, he adds an argument that I think is crucial because it addresses the Internet’s effect on knowledge and authority:

Think about the Internet. You know it is full of idiocy, mistake, vituperation and lies. Yet search engines routinely extract useful information for you out of this chaos. How do they do it? In part, by relying on the network of links that users generate. If 50 copyright professors link to a particular copyright site, then it is probably pretty reliable.

Where are those links for the scientific literature? Citations are one kind of link; the hyperlink is simply a footnote that actually takes you to the desired reference. But where is the dense web of links generated by working scientists in many disciplines, using semantic web technology and simple cross reference electronically to tie together literature, datasets and experimental results into a real World Wide Web for science? The answer is, we cannot create such a web until scientific articles come out from behind the publishers’ firewalls….


Is Wikipedia getting too hard? A random sampling

A couple of weeks ago, I happened to come across a few Wikipedia articles that struck me as too hard. I started getting worried that Wikipedia’s constant review process was resulting in articles inching up the Technical Accuracy pole while slipping down the Intelligibility for Non-Experts pole.

So, I checked in on a handful of articles, looking particularly at the introductory paragraphs. Here are the examples, minus the many hyperlinks. (My premise is that you shouldn’t have to click on a hyperlink to figure out what the intro is talking about.)

Please note that this is an entirely unscientific, non-significant sampling. Still, the result is that I’m pretty much reassured. I think these generally are quite understandable intros. I wonder what your experience has been.

Fibonacci number

In mathematics, the Fibonacci numbers are a sequence of numbers named after Leonardo of Pisa, known as Fibonacci (a contraction of filius Bonaccio, "son of Bonaccio"). Fibonacci’s 1202 book Liber Abaci introduced the sequence to Western European mathematics, although the sequence had been previously described in Indian mathematics.[2][3]

The first number of the sequence is 0, the second number is 1, and each subsequent number is equal to the sum of the previous two numbers of the sequence itself, yielding the sequence 0, 1, 1, 2, 3, 5, 8, etc. In mathematical terms, it is defined by the following recurrence relation:
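
Written out, the recurrence that intro describes is:

\[
F_0 = 0, \qquad F_1 = 1, \qquad F_n = F_{n-1} + F_{n-2} \quad (n \ge 2).
\]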

Iambic pentameter

Iambic pentameter is a type of meter that is used in poetry and drama. It describes a particular rhythm that the words establish in each line. That rhythm is measured in small groups of syllables; these small groups of syllables are called "feet". The word "iambic" describes the type of foot that is used. The word "pentameter" indicates that a line has five of these "feet".

Entropy

In many branches of science, entropy is a measure of the disorder of a system. The concept of entropy is particularly notable as it is applied across physics, information theory and mathematics.

In thermodynamics (a branch of physics), entropy, symbolized by S,[3] is a measure of the unavailability of a system’s energy to do work.[4][5] It is a measure of the disorder of molecules in a system, and is central to the second law of thermodynamics and to the fundamental thermodynamic relation, both of which deal with physical processes and whether they occur spontaneously. Spontaneous changes in isolated systems occur with an increase in entropy. Spontaneous changes tend to average out differences in temperature, pressure, density, and chemical potential that may exist in a system, and entropy is thus a measure of how great those spontaneous changes are.

Uncertainty principle

In quantum physics, the Heisenberg uncertainty principle states that the values of certain pairs of conjugate variables (position and momentum, for instance) cannot both be known with arbitrary precision. That is, the more precisely one property is known, the less precisely the other can be known. This is not a statement about the limitations of a researcher’s ability to measure particular quantities of a system, but rather about the nature of the system itself.

In quantum mechanics, the particle is described by a wave. The position is where the wave is concentrated and the momentum, a measure of the velocity, is the wavelength. The position is uncertain to the degree that the wave is spread out, and the momentum is uncertain to the degree that the wavelength is ill-defined.

Black hole

According to Einstein’s theory of general relativity, a black hole is a region of space in which the gravitational field is so powerful that nothing, including electromagnetic radiation (e.g. visible light), can escape its pull after having fallen past its event horizon. The term derives from the fact that absorption of visible light renders the hole’s interior invisible, and indistinguishable from the black space around it.

Despite its invisible interior, a black hole may reveal its presence through interaction with matter orbiting the event horizon. For example, a black hole may be perceived by tracking the movement of a group of stars that orbit its center. Alternatively, one may observe gas (from a nearby star, for instance) that has been drawn into the black hole. The gas spirals inward, heating up to very high temperatures and emitting large amounts of radiation that can be detected from earthbound and earth-orbiting telescopes.[2][3] Such observations have resulted in the general scientific consensus that—barring a breakdown in our understanding of nature—black holes do exist in our universe.[4]

Hawking radiation

Hawking radiation (also known as Bekenstein-Hawking radiation) is a thermal radiation with a black body spectrum predicted to be emitted by black holes due to quantum effects. It is named after the physicist Stephen Hawking who provided the theoretical argument for its existence in 1974, and sometimes also after the physicist Jacob Bekenstein who predicted that black holes should have a finite, non-zero temperature and entropy. Hawking’s work followed his visit to Moscow in 1973 where Soviet scientists Yakov Zeldovich and Alexander Starobinsky showed him that according to the quantum mechanical uncertainty principle, rotating black holes should create and emit particles.[1] The Hawking radiation process reduces the mass of the black hole and is therefore also known as black hole evaporation.

Because Hawking radiation allows black holes to lose mass, black holes that lose more matter than they gain through other means are expected to dissipate, shrink, and ultimately vanish. Smaller micro black holes (MBHs) are predicted to be larger net emitters of radiation than larger black holes, and to shrink and dissipate faster.

Higgs boson

In particle physics, the Higgs boson is a massive scalar elementary particle predicted to exist by the Standard Model.

The Higgs boson is the only Standard Model particle that has not yet been observed. Experimental detection of the Higgs boson would help explain how massless elementary particles can have mass. More specifically, the Higgs boson would explain the difference between the massless photon, which mediates electromagnetism, and the massive W and Z bosons, which mediate the weak force. If the Higgs boson exists, it is an integral and pervasive component of the material world.

Deontological ethics

Deontological ethics or deontology (from Greek δέον, deon, "obligation, duty"; and -λογία, -logia) is an approach to ethics that focuses on the rightness or wrongness of intentions or motives behind action such as respect for rights, duties, or principles, as opposed to the rightness or wrongness of the consequences of those actions.[1]

It is sometimes described as "duty" or "obligation" based ethics, because deontologists believe that ethical rules "bind you to your duty".[2] The term ‘deontological’ was first used in this way in 1930, in C. D. Broad’s book, Five Types of Ethical Theory.[3]

Phenomenology (philosophy)

Phenomenology is a philosophy or method of inquiry based on the premise that reality consists of objects and events as they are perceived or understood in human consciousness and not of anything independent of human consciousness. Developed in the early years of the twentieth century by Edmund Husserl and a circle of followers at the universities of Göttingen and Munich in Germany, phenomenological themes were taken up by philosophers in France, the United States, and elsewhere, often in contexts far removed from Husserl’s work.

"Phenomenology" comes from the Greek words phainómenon, meaning "that which appears," and lógos, meaning "study." In Husserl’s conception, phenomenology is primarily concerned with making the structures of consciousness, and the phenomena which appear in acts of consciousness, objects of systematic reflection and analysis. Such reflection was to take place from a highly modified "first person" viewpoint, studying phenomena not as they appear to "my" consciousness, but to any consciousness whatsoever. Husserl believed that phenomenology could thus provide a firm basis for all human knowledge, including scientific knowledge, and could establish philosophy as a "rigorous science".

Sarcoidosis

Sarcoidosis, also called sarcoid (from the Greek sarx, meaning "flesh") or Besnier-Boeck disease, is a multisystem disorder characterized by non-caseating granulomas (small inflammatory nodules). It most commonly arises in young adults. The cause of the disease is still unknown. Virtually any organ can be affected; however, granulomas most often appear in the lungs or the lymph nodes. Symptoms usually appear gradually but can occasionally appear suddenly. The clinical course generally varies and ranges from asymptomatic disease to a debilitating chronic condition that may lead to death.

Cascading Style Sheets

[This is what it said before I edited it, to try to make it a bit more understandable to those who don’t already know about the topic.]

Cascading Style Sheets (CSS) is a stylesheet language used to describe the presentation of a document written in a markup language. Its most common application is to style web pages written in HTML and XHTML, but the language can be applied to any kind of XML document, including SVG and XUL.

CSS can be used locally by the readers of web pages to define colors, fonts, layout, and other aspects of document presentation. It is designed primarily to enable the separation of document content (written in HTML or a similar markup language) from document presentation (written in CSS). This separation can improve content accessibility, provide more flexibility and control in the specification of presentation characteristics, and reduce complexity and repetition in the structural content (such as by allowing for tableless web design). CSS can also allow the same markup page to be presented in different styles for different rendering methods, such as on-screen, in print, by voice (when read out by a speech-based browser or screen reader) and on Braille-based, tactile devices. CSS specifies a priority scheme to determine which style rules apply if more than one rule matches against a particular element. In this so-called cascade, priorities or weights are calculated and assigned to rules, so that the results are predictable.

Markup language

[This was how it began before I cleaned it up slightly.]

A markup language is an artificial language using a set of annotations to text that give instructions regarding the structure of text or how it is to be displayed. Markup languages have been in use for centuries, and in recent years have been used in computer typesetting and word-processing systems.

A well-known example of a markup language in use today in computing is HyperText Markup Language (HTML), one of the most used in the World Wide Web. HTML follows some of the markup conventions used in the publishing industry in the communication of printed work among authors, editors, and printers.

RNA

Ribonucleic acid (RNA) is a type of molecule that consists of a long chain of nucleotide units. Each nucleotide consists of a nitrogenous base, a ribose sugar, and a phosphate. RNA is very similar to DNA, but differs in a few important structural details: in the cell, RNA is usually single-stranded, while DNA is usually double-stranded; RNA nucleotides contain ribose while DNA contains deoxyribose (a type of ribose that lacks one oxygen atom); and RNA has the base uracil rather than thymine that is present in DNA.

RNA is transcribed from DNA by enzymes called RNA polymerases and is generally further processed by other enzymes. RNA is central to the synthesis of proteins. Here, a type of RNA called messenger RNA carries information from DNA to structures called ribosomes. These ribosomes are made from proteins and ribosomal RNAs, which come together to form a molecular machine that can read messenger RNAs and translate the information they carry into proteins. There are many RNAs with other roles – in particular regulating which genes are expressed, but also as the genomes of most viruses.

Twelve-tone scale

The chromatic scale is a musical scale with twelve pitches, each a semitone or half step apart. "A chromatic scale is a nondiatonic scale consisting entirely of half-step intervals," having, "no tonic," due to the symmetry or equal spacing of its tones[1].

[Image of “Chromatic scale on C: full octave ascending and descending”]

The most common conception of the chromatic scale before equal temperament was the Pythagorean chromatic scale, which is essentially a series of eleven 3:2 perfect fifths. The twelve-tone equally tempered scale tempers, or modifies, the Pythagorean chromatic scale by lowering each fifth slightly less than two cents, thus eliminating the Pythagorean comma of approximately 23.5 cents. Various other temperaments have also been proposed and implemented.

The term chromatic derives from the Greek word chroma, meaning color. Chromatic notes are traditionally understood as harmonically inessential embellishments, shadings, or inflections of diatonic notes.

I find the above pretty much incomprehensible.
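
For what it’s worth, the arithmetic behind that “comma” is short: twelve pure 3:2 fifths overshoot seven octaves by a small ratio, and spreading that overshoot across the twelve fifths is where the “slightly less than two cents” comes from:

\[
\frac{(3/2)^{12}}{2^{7}} = \frac{531441}{524288} \approx 1.0136, \qquad
1200 \log_2(1.0136) \approx 23.5 \text{ cents}, \qquad
\frac{23.5}{12} \approx 1.96 \text{ cents per fifth}.
\]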

Semiotics

Semiotics, also called semiotic studies or semiology, is the study of sign processes (semiosis), or signification and communication, signs and symbols, both individually and grouped into sign systems. It includes the study of how meaning is constructed and understood.

One of the attempts to formalize the field was most notably led by the Vienna Circle and presented in their International Encyclopedia of Unified Science, in which the authors agreed on breaking out the field, which they called "semiotic", into three branches: …

Designated hitter rule

In baseball, the designated hitter rule is the common name for Major League Baseball Rule 6.10[1], an official position adopted by the American League in 1973 that allows teams to designate a player, known as the designated hitter (abbreviated DH), to bat in place of the pitcher. Since then, most collegiate, amateur, and professional leagues have adopted the rule or some variant; MLB’s National League and Nippon Professional Baseball’s Central League are the most prominent professional leagues that have not.

Derivatives (finance)

Derivatives are financial contracts, or financial instruments, whose values are derived from the value of something else (known as the underlying). The underlying on which a derivative is based can be an asset (e.g., commodities, equities (stocks), residential mortgages, commercial real estate, loans, bonds), an index (e.g., interest rates, exchange rates, stock market indices, consumer price index (CPI) — see inflation derivatives), or other items (e.g., weather conditions, or other derivatives). Credit derivatives are based on loans, bonds or other forms of credit.

The main types of derivatives are forwards, futures, options, and swaps.

Derivatives can be used to mitigate the risk of economic loss arising from changes in the value of the underlying. This activity is known as hedging. Alternatively, derivatives can be used by investors to increase the profit arising if the value of the underlying moves in the direction they expect. This activity is known as speculation.

Because the value of a derivative is contingent on the value of the underlying, the notional value of derivatives is recorded off the balance sheet of an institution, although the market value of derivatives is recorded on the balance sheet.

This intro doesn’t do a good job explaining derivatives or hedges, but the article itself is actually fairly clear.


Government mandates stimulus outlays be RSS’ed

Aaron Swartz reports that the stimulus bill requires that government agencies use RSS to report on the stimulus money they disburse, so that those who are interested can get automatically updated. And those who are interested will include institutions and individuals aggregating that information so that the alarms can sound … and, we hope, the bouquets can shower down.
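
As a rough sketch of what those aggregators would be doing, here’s how reading such a feed might look in Python, using the feedparser library. (The feed URL is made up for illustration; the real feeds would be whatever the agencies publish.)

    # Minimal sketch: poll a (hypothetical) agency stimulus feed and list its items.
    # feedparser is a real Python library; the URL below is invented for illustration.
    import feedparser

    FEED_URL = "http://example.gov/recovery/outlays.rss"  # hypothetical

    feed = feedparser.parse(FEED_URL)
    for entry in feed.entries:
        # Each item would describe one outlay: when it was posted, what it was, and a link.
        print(entry.get("published", "n.d."), entry.get("title", ""), entry.get("link", ""))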


Open Access: Half step forward, big possible step back

Boston University has decided to set up an open access archive for scholarly work produced there. Yay. This seems to stop short of mandating that academics there put a copy of their published work into the archive, but it’s a good step.

On the other hand, the Open Access Blog reports:

Congressional Representative John Conyers (D-MI) has re-introduced a bill (HR801) that essentially would negate the NIH policy concerning depositing research in OA repositories.

Here are the first three points in a letter posted by Jennifer McLennan:

H.R. 801 is designed to amend current copyright law and create a new category of copyrighted works (Section 201, Title 17). In effect, it would:

1. Prohibit all U.S. federal agencies from conditioning funding agreements to require that works resulting from federal support be made publicly available if those works are either: a) funded in part by sources other than a U.S. agency, or b) the result of “meaningful added value” to the work from an entity that is not party to the agreement.

2. Prohibit U.S. agencies from obtaining a license to publicly distribute, perform, or display such work by, for example, placing it on the Internet.

3. Stifle access to a broad range of federally funded works, overturning the crucially important NIH Public Access Policy and preventing other agencies from implementing similar policies.

Here’s a draft letter opposing it.


Alltop skims the surface for us

There’s nothing wrong with scratching the surface if that’s what itches.

Alltop.com is a surface skimmer, and it looks quite useful. So, if you’re interested in one of the topics they cover, such as, say, cloud computing, you could click to see a compact listing of what bloggers and others are saying about it. Or GLBT. Or Obama. Or what the top stories in the LA Times are.

The weakness, as well as the strength, is that Alltop decides on the topics and sources (although the sources include searches of Google News, Technorati, and other aggregators). So, a search for “stimulus” turns up nothing because it is not one of the topics, although it is certainly mentioned in some of the content Alltop rounds up.

The very amusing “about” page reports that the decision about topics and sources of topics is made by humans and is “highly subjective and judgmental.” It helps that the site wants to push us out of our comfort zone by including sources we might otherwise scorn. But it’s important to keep in mind that — despite inevitable appearances — Alltop is essentially a magazine that reflects the interests and values of its editors. Nothing wrong with that at all. On the contrary. It’s just important that we not mistake Alltop for anything else. (More concerning to me is this criterion for inclusion: “We take care of our friends. If sites or blogs help us, we help them.” Is that really the best policy for the sake of us readers?)

Anyway, Alltop looks like a useful way to track the topics that it and you care about.

SpokenWord.org aggregates spoken words

Douglas Kaye, founder of IT Conversations and the Conversations Network, has launched SpokenWord.org. Here’s part of the announcement:

There are perhaps millions of audio and video spoken-word recordings on the Internet. Think of all those lectures, interviews, speeches, conferences, meetings, radio and TV programs and podcasts. No matter how obscure the topic, someone has recorded and published it on line.

But how do you find it?

SpokenWord.org is a new free on-line service that helps you find, manage and share audio and video spoken-word recordings, regardless of who produced them or where they’re published. All of the recordings in the SpokenWord.org database are discovered on the Internet and submitted to our database by members like you.

This is another public-spirited work from a public-spirited guy who has assembled and inspired a public-spirited collective. [Disclosure: I’m on the board of advisers.]


Fourier Tweet Transforms

Me, on Twitter:

Challenge: Explain Fourier Transforms, w/o math, to a Humanities major (me), more clearly than http://tinyurl.com/27n3g … in 1 tweet?

(Note: I’ve corrected the URL, which points to the Wikipedia article and which had an extra character in it.)

The responses, in the order received:

jonathanweber @dweinberger Looking at a periodic signal in time, the Fourier transform explains it in terms of what mix of frequencies is present. Helps?

DarrylParker @dweinberger the better question is why does a humanities major like you need to understand it? ;)

cparasat @dweinberger It’s just adding waves to other waves.

DarrylParker @dweinberger simpler overview of the Fourier series, but still a bit mathematical – http://tinyurl.com/chw5pp

fantomplanet @dweinberger Fourier Xformations are like ironing your shirt. It smooths things out.

JoeAndrieu @dweinberger FTs take a signal in time and represent it as a series of frequencies. Makes audio signal look like an equalizer graph.

ts_eliot @dweinberger in 1 tweet?! impossible

fanf @dweinberger the FT splits a signal into separate frequencies, like a prism splits light

fjania @dweinberger – it shows us which, and how much of each, simple sine waves we can add together to reconstruct the signal we’re transforming.

ricklevine @dweinberger Hm. Fourier transforms convert a bunch of sample measurements (audio, seismic data, etc) into frequency info: http://is.gd/iQP3

ricklevine @dweinberger Of course there’s a lot more to it. Try: it’s a way of taking seemingly rndm data and fitting a curve to it, enabling analysis.

fields @dweinberger Things you don’t understand can be expressed in smaller equivalent pieces of things you don’t understand.

IanYorston @dweinberger “Explain Fourier Transforms to a Humanities major”. Smart maths breaks large constructs down into small things loosely joined.

mtobis @dweinberger: your tinyurl fails. Fourier transform an audio signal and get back an amplitude for each pure tone; no information is lost.

vasusrini @dweinberger Sound=Vibrating Air.Bee Buzz & dog bark=diff. frequency signatures. Ear hears all at once & sorts it. FT is the m/c equivalent.

chichiri @dweinberger just say without it you wouldn’t have JPEGs, enough said ;)

artficlinanity @dweinberger Every signal, no matter how complex, is made up of simple sinusoids. Fourier Transformation is how you find those.

vnitin @dweinberger every physical phenomenon can be viewed as existing in space+time or vibrations+energy. FourierTrfm converts view1 -> view2

_eon_ @dweinberger think of waves on the ocea
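
For anyone who wants to see the one-tweet explanations at work, here’s a toy sketch of my own (numpy’s FFT, not anything from the thread): mix two sine waves, run the Fourier transform, and get back which frequencies were in the mix and how strongly.

    # Toy illustration: build a signal from two tones, then let the Fourier
    # transform report which frequencies it contains and how much of each.
    import numpy as np

    rate = 1000                        # samples per second
    t = np.arange(0, 1, 1 / rate)      # one second of time points

    # A 50 Hz wave plus a quieter 120 Hz wave.
    signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

    spectrum = np.fft.rfft(signal)                 # the transform itself
    freqs = np.fft.rfftfreq(len(signal), 1 / rate)

    # The two strongest components come back at 50 Hz and 120 Hz:
    # "a wiggle in time" has become "a mix of frequencies."
    peaks = freqs[np.argsort(np.abs(spectrum))[-2:]]
    print(sorted(peaks))               # [50.0, 120.0]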
