Lexicography

New Resource for Septuagint Vocabulary

Just a brief post here to mention the publication of an excellent new resource for the Septuagint studies community. Last month Eisenbrauns published No Stone Unturned: Greek Inscriptions and Septuagint Vocabulary (CSHB 5).

Of course, I am somewhat biased in this particular instance, as the author is my supervisor, Jim Aitken. (And no, he is not paying me to do this post). But if you are interested in LXX studies and have not seen this book, you will want to pick it up. At just $26 (here), it’s a great bargain.

Septuagint Vocabulary

I have posted a few times in the past on various matters in LXX studies that overlap with the issue of vocabulary. Most notable are the first two posts in my series discussing the approach of modern language translations of the Septuagint (here and here). As I mentioned, there is ongoing discussion among Septuagintalists regarding just how a LXX word is to be defined. Part of the reason folks differ on that issue is differing views on what the LXX actually is (or was originally meant to be), and on the extent to which that influences word meaning.

Inscriptions & Lexicography

The purpose of Aitken’s new volume, however, is to draw more attention from all parties to inscriptions as a primary resource. In the discipline of Greek lexicography, there are many rooms. Some of these are very heavily trafficked. Word usage and development are extremely well documented for sources like Classical works, the New Testament, and related literature (Philo, Josephus, the Fathers). Other rooms, however, are quite dark and forgotten. That is certainly the case with inscriptions, which offer a range of vocabulary and registers from a variety of regions and over many centuries.

That is why inscriptions are so important, and why it is so unfortunate that they have largely been overlooked in the lexicographical enterprise (another reason being the relatively recent discovery of many of them). Of course, there are major difficulties in dealing with inscriptions, and those wishing to incorporate data from them into their research (such as myself) will have to do much of the work de novo. Inscriptions are published in specialized and scattered volumes (with obscure commentary, often in German or Italian), are rarely translated, and employ difficult and fragmentary Greek.

Fortunately, the wonderful opportunities that these challenging primary sources offer are now somewhat more accessible with Aitken’s new book. It helpfully (and briefly!) describes recent discussions in LXX vocabulary and Greek lexicography in general, explains in detail why inscriptions are important, and then describes how to do the work of using them. Grab a copy!

LXX Translations Part I: NETS

In a previous post I announced a new ‘series’ in which I will outline the various principles and procedures involved in the current modern language translations of the Septuagint. Apologies that it has taken me so much time to get back around to working on this series. In the time since I wrote the initial post, we had our second child and moved from Philadelphia to Cambridge, where I have just begun doctoral work. This left me short on time and without any of my books!

But now, back to business.

I mentioned four modern translation projects of this sort – some finished, some still in the works. The first and, for many, most relevant of these is the most recent English translation. While there are older versions (namely Thomson [here] and Brenton [also here]), NETS is your best bet, generally speaking, for a “good” English rendering. As I mentioned, NETS is also accessible freely online (copyright), although owning a hard copy is well worth the cost since you’ll be referring to it so often. You can also purchase it for Accordance and Olive Tree.

Of course, just what constitutes a “good” translation is exactly the topic of this series. With that, I will dive into the approach of the scholars who produced NETS.

The New English Translation of the Septuagint

Each of the modern language translations of the LXX operates on the basis of assumptions about the nature of the LXX itself. What makes NETS unique among them is its understanding of the LXX as a text subservient to the Hebrew scriptures that came before it. Indeed, proponents of the NETS approach understand the Greek and Hebrew scriptures to have coexisted with equal importance, so to speak, after the Greek version was produced. This is not merely to say that both Hebrew and Greek texts existed, but that the Hebrew continued to serve as a religious text, rather than being supplanted by a new translation. The reason, in the perspective of the NETS group, is that the LXX was intended to serve as a pedagogical tool for Jewish students of the Hebrew text, which was always read alongside the Greek so that the two texts were best understood in conversation with one another.

The basic framework that NETS uses is called the Interlinear Paradigm. The genius behind this model is Albert Pietersma, who presents a conceptual school setting in which Greek and Hebrew texts were read in interlinear fashion. Pietersma states that the dependent status of the Greek version entails that “for the vast majority of Septuagint books this linguistic relationship can best be conceptualized as a Greek inter-linear translation of a Hebrew original within a Hebrew-Greek diglot” (“To the Reader of NETS,” xiv). “Conceptualized” is a key word, as he quickly clarifies that the term “interlinear” or “diglot” is merely a visual aid of sorts to help conceptualize the linguistic relationship. He is not proposing (or denying) that there were actual interlinear Greek/Hebrew texts circulating among Jewish students.

Not like the NIV

Not how NETS envisions the LXX

In short, NETS takes the approach, unique among its peers, that the intention of the LXX was to bring its reader to the Hebrew text, rather than bringing the Hebrew text to the reader. In other words, the point of translating the Hebrew scriptures was not to help Greek-speaking Jews comprehend a text that had become arcane and difficult to them in the Hellenistic era by producing a comprehensible translation (much like, say, the NIV is intended to help the ‘average Joe’ read the Bible with ease). Rather, it was to help Greek-speaking Jews retain religious (and scholarly?) access to ‘the original,’ the Hebrew scriptures themselves (much like, well, an interlinear Greek New Testament). That was the intention of the LXX as produced.

Pietersma suggests this model has explanatory power. Most significantly, it accounts for the characteristic of “literalism” in most LXX books’ translation – that is, in many or even most cases (depending on the book) the Greek words match roughly one-to-one with the Hebrew words (MT). This is what is called the “constitutive character” of the LXX, and one often sees the word “isomorphism” or “isomorphic” in the relevant literature to describe this translational pattern. Importantly, interlinearity is meant to account for the many places where the Greek phrasing of the LXX is difficult to understand as Greek, or without reference to the Hebrew. If the Greek is awkward, so the logic goes, this is best explained by the assumption that the translator was attempting to mimic the Hebrew text as closely as possible, without regard (or with little regard) to the style or literary acceptability of his text in the Greek cultural environment.

Production and Reception

To bring this whistle-stop tour full circle, this methodological approach has practical implications for the English version. NETS scholars will go on to say that the Jewish community eventually read the LXX independently, as a “received” text, despite its difficulty. In similar fashion, when the NETS team themselves translated the LXX, their approach was to clarify the Greek text by referring to the Hebrew text, rather than attempting to puzzle it out as a (hypothetically) intelligible text qua Greek text.

Some of the NETS team

Put another way, in view of their understanding of how the Greek translation functioned, the NETS translators aim to have their English translation (of the LXX translation) function the same way they imagine the original translation functioned. Namely, users “should be able to utilize it [NETS] … in a comparative study of the Hebrew and Greek texts, albeit in English translation” (ibid., xv). This is meant to be the case both quantitatively (i.e. word count) and qualitatively (i.e. style and tone).

So hopefully you can see how one’s understanding – or assumption – of what the LXX is and how it was originally meant to function as a text has a profound influence upon how a modern translation is produced. It is necessary to have some assumption like this, although as we’ll see in future posts, there are at least three other possible assumptions about Septuagint origins, and these are distinct enough from that of NETS to amount to a fairly different product.

If you want to dive in further, I recommend listening to this excellent interview by T. Michael Law, which features top Interlinear Paradigm proponent Benjamin G. Wright explaining the methodology in detail (my apologies for the gaudy musical introduction).

Next time I will deal with La Bible D’Alexandrie, the ongoing French translation project, so stay tuned!

(A very brief) Bibliography

A. Pietersma, “To the Reader of NETS,” in A New English Translation of the Septuagint (Oxford: Oxford University Press, 2007) (pdf available here).

J. Joosten, “Reflections on the ‘Interlinear Paradigm’ in Septuagintal Studies,” in Scripture in Transition (Leiden: Brill, 2008) (pdf available here).

More freely available here

The “Annoying Little Words” & Exegesis – An Interpretive Lexicon

This is the second post out of two (see the first here) describing my recent, co-authored publication An Interpretive Lexicon of New Testament Greek (here). In the first I described the “interpretive” and “lexicon” aspects of the book. Here I want to focus on what I think is the best feature of it, and why it’s an exegetical golden goose. Let me preface much of this by saying that our “Introduction” in the Lexicon covers more detailed material that will also be helpful.

This post is a bit technical and won’t have many pictures, so strap on your thinking cap.

The Significance of the “Annoying Little Words”

I began to talk about function words in the first post. These are the words that students usually think of as quite annoying. For the most part, that is correct, since these words rarely have a neat definition that can be slapped on the back of a flashcard. The reason is that their whole raison d’être is to connect larger ideas (typically clauses, but also paragraphs and other larger units of text). This basically means that the annoying little words are “multivalent” or “polysemous.” That is to say, they often take one of two or more possible meanings, depending on their context. And of course, since they are “function” words after all, the meaning they take in context will greatly affect what they do.

Okay, so that was abstract. Let’s get textual. Look at the fancy graphic above that I made. It shows a ‘cloud’ of the most frequently used words in the book of Romans. Notice how the obvious candidates like χάρις (‘grace’) or δικαιοσύνη (‘righteousness’) or νόμος (‘law’) are not immediately visible. The most prominent words are … you guessed it, the annoying little words. You get a gigantic καί and a δέ, a γάρ, a few definite article forms, and a few prepositions (διά, εἰς, ἐν). In fact, the one and only content word that is fairly visible is the genitive form of θεός (‘of God’).
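For the curious, here is a minimal sketch (in Python) of the kind of frequency count that sits behind a word cloud like the one just described. It is not the tool used to make the graphic; the file name romans_greek.txt and the crude accent-stripping are assumptions for illustration only.

```python
from collections import Counter
import re
import unicodedata

def word_frequencies(path):
    """Count word frequencies in a plain-text Greek file (illustrative only)."""
    with open(path, encoding="utf-8") as f:
        text = f.read().lower()
    # Strip accents and breathings so variant spellings of the same form group
    # together. (A crude shortcut; real lexical work would lemmatize instead.)
    text = "".join(ch for ch in unicodedata.normalize("NFD", text)
                   if not unicodedata.combining(ch))
    words = re.findall(r"[\u0370-\u03ff]+", text)
    return Counter(words)

# Hypothetical plain-text file of the Greek text of Romans.
freqs = word_frequencies("romans_greek.txt")
print(freqs.most_common(10))  # expect και, δε, γαρ, articles, and prepositions near the top
```

Run this against any Greek text and the same pattern emerges: the little connecting words dominate the top of the list.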

My point is that you can only get to the “big ideas” of a book like Romans – or any text – by first going through the little words. They are absolutely indispensable to communication, slippery as they are to pin down to a single definition. Fortunately, we use function words automatically in our everyday speech and never give it a second thought. Unfortunately, this can make it all too easy to overlook their incredible importance in the task of interpretation.

Discourse Analysis

An English Example

To do some of the heavy lifting of dealing with function words in interpretation, some undertake a process that many call ‘discourse analysis,’ although it goes by other names as well (e.g. ‘text linguistics’). What this process aims to do is discern the larger structures and connectedness of a text. Remember that function words are sometimes called “connecting” words. They connect two (or more) larger chunks of text. As a result, if you want to determine the connection between Thought ‘A’ and Thought ‘B’ then you need to understand the function words that relate them.

Take the previous sentence for instance. It is made up of two main clauses:

1) you want to determine

and

2) you need to understand

Somehow, the two actions – 1. determining and 2. understanding – are related logically in that sentence. And the way they are related is by the two function words if and then. The second clause (needing to understand) is conditional upon the first (wanting to determine). This may seem obvious, but the point is that the words ‘if’ and ‘then’ manifest the conditional relationship between these two clauses, and therefore help the reader or listener ‘exegete’ this bit of communication.

But there is another important part of that sentence: the very first phrase, “As a result.” What we have here is a phrase – a syntactical construction – that serves as a road sign to the logic of the larger text. Linguists sometimes call this a ‘discourse marker’ (among other things). What the ‘as a result’ phrase does is link that sentence logically to the one that precedes it. In essence, the idea is “A is such, therefore B is such.” The ‘B’ aspect is a result of the ‘A’ aspect.

Getting Greeky

Let’s have a look at Romans 11:23:

“And they also, if they do not continue in their unbelief, will be grafted in; for God is able to graft them in again” (NASB).

κἀκεῖνοι δέ, ἐὰν μὴ ἐπιμένωσιν τῇ ἀπιστίᾳ, ἐγκεντρισθήσονται· δυνατὸς γάρ ἐστιν ὁ θεὸς πάλιν ἐγκεντρίσαι αὐτούς.

I have boldfaced the (main) function words in the sentence. Note that the first one, ‘and,’ is a conjunction that ties this sentence to the one that precedes it as a coordinate idea. Then there is a(n implicit) conditional clause with the ‘if’ statement, so that the notion is ‘if they do not continue in their unbelief, then they will be grafted in.’ Finally, the rationale that grounds this statement is provided in the next clause and introduced by the word for: “for God is able …”
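To make that parse concrete, here is a toy representation of the verse’s discourse structure in code. The labels are my own shorthand for the relations just described, not the notation used in the Lexicon (on which more below).

```python
# A toy sketch: each function word in Rom 11:23 tagged with the discourse
# relation it signals, using my own (invented) labels.
rom_11_23 = [
    {"marker": "δέ (with κἀκεῖνοι)", "relation": "coordination with the preceding sentence"},
    {"marker": "ἐάν", "relation": "condition (the 'then' of the main clause is implicit)"},
    {"marker": "γάρ", "relation": "ground/reason for the preceding claim"},
]

for item in rom_11_23:
    print(f"{item['marker']}: {item['relation']}")
```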

The Logical Main Point

All of this may seem pedantic. But there is a payoff. Language has what scholars call ‘semantic structure.’ That is to say, there is an ‘architecture,’ so to speak, of any communication (written or otherwise) that makes it understandable. As with a building, a well-constructed piece of writing or speech has a solid frame. Instead of steel beams, however, language uses what we might call semantic logic. It is important to realize that the presence of function words like ‘because’ or ‘therefore’ does not produce logical structure, but manifests it. In other words, the connecting words are there because language has semantic structure, not the other way around.

Here’s proof. In the Rom. 11:23 example above I mentioned that there was an implicit conditional clause. That is because the “second half” of the conditional – the word then – does not actually appear in the text. It is implied. And yet as readers or hearers we understand the conditional sense nevertheless. This applies to other logical relationships as well. For example, I can say “I’m not going outside. It’s cold,” and you understand perfectly that the second statement is the reason for the first, and that the two could be connected by the word because to the same effect. The logical structure is there whether or not the words are there to point to it. (Also note that one could not put a ‘therefore’ between those two clauses without producing nonsense; only certain logical relationships are possible in a given context.)

The “Interpretive” Aspect of the Lexicon (Again)

Bringing this all the way back around to the Interpretive Lexicon, as I alluded to in my first post, we use a system of letters and symbols to key the reader into the logical relationship – the discourse-level function – of the word being discussed. Again, these words are often multivalent and can be taken in several ways depending on context. That is where our lexicon comes in, to help the reader swiftly narrow down the possible logical relationships of a word (or phrase) in Greek, and therefore to better (and more quickly) understand the text.
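As a rough illustration of how that kind of lookup works, here is a small sketch. The entries and labels below are invented shorthand for the sake of example; they do not reproduce the Lexicon’s actual entries or abbreviation system.

```python
# Illustrative only: a tiny lookup mapping Greek function words to candidate
# discourse relations, which the exegete then narrows down from context.
CANDIDATE_RELATIONS = {
    "γάρ": ["ground/reason", "explanation"],
    "οὖν": ["inference/result"],
    "ἵνα": ["purpose", "result"],
    "δέ": ["development", "contrast", "simple connective"],
    "ἐάν": ["condition"],
}

def candidate_relations(word):
    """Return the possible discourse-level relations for a function word."""
    return CANDIDATE_RELATIONS.get(word, ["(not in this toy sample)"])

print(candidate_relations("γάρ"))  # e.g. in Rom 11:23, γάρ grounds the preceding claim
```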

To conclude, here is the set of logical relationships included in the Lexicon. We also include an extended section carefully defining each one and providing an example. We have also aligned our logical relationships with those used at John Piper’s online site BibleArc.com in order to maximize compatibility. It is our great hope that the Lexicon can be used to help pastors, students, and scholars alike as they read and interpret the Greek scriptures.

[Table of the logical relationships and their abbreviations]

G. K. Beale, Daniel J. Brendsel, and William A. Ross, An Interpretive Lexicon of New Testament Greek (Zondervan, 2014), 23.